Selected publications of Allen Forte:

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt,
Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language.
Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York:
W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton:
Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also: Forte number

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different
aspects of the association between two random variables. One could comment that
Mutual Information "is not concerned" with whether the association is linear or
not, while Covariance may be zero and the variables may still be stochastically
dependent. On the other hand, Covariance can be calculated directly from a data
sample without the need to actually know the probability distributions involved
(since it is an expression involving moments of the distribution), while Mutual
Information requires knowledge of the distributions, whose estimation, if they are
unknown, is a much more delicate and uncertain task than the estimation of
Covariance.
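
To make the contrast concrete, here is a minimal Python sketch (not from the
source; the sample size, noise level, and bin count are arbitrary choices). With y
depending on x only through x^2, the sample covariance is near zero even though
the variables are clearly dependent, while a crude histogram ("plug-in") estimate
of the mutual information is clearly positive.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100_000)
y = x**2 + 0.05 * rng.normal(size=x.size)   # dependent on x, but not linearly

# Covariance comes straight from sample moments; no distributions needed.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))

def mi_histogram(x, y, bins=30):
    """Crude plug-in estimate of I(X;Y) from a 2-D histogram, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    m = pxy > 0
    return float(np.sum(pxy[m] * np.log(pxy[m] / (px @ py)[m])))

print(f"sample covariance ~ {cov_xy:+.4f}")                   # near zero
print(f"MI estimate       ~ {mi_histogram(x, y):.3f} nats")   # clearly > 0
```

The bin count also illustrates the caveat above: the MI estimate depends on how
the densities are estimated, while the covariance does not.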
Theory Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled
by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they
are unordered in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by
integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by
any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are
unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize
a pcset in music, it must be ordered in pitch and in time. Every musical
articulation of a pcset produces a contour. Many different psets may represent one
pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U;
the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show
the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
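
As an illustration of the definitions above, here is a minimal Python sketch (the
function names and the use of Python sets are my own, not the primer's): pitch
classes as integers mod 12, pcsets as sets, with union, intersection, and
complement behaving as in (6)-(12).

```python
# Pitch classes as integers mod 12; pcsets as Python (frozen)sets.
AGGREGATE = frozenset(range(12))   # the aggregate U

def pc(pitch: int) -> int:
    """Map a pitch (integer number of semitones) to its pitch class, mod 12."""
    return pitch % 12

def pcset(pitches) -> frozenset:
    """The pcset represented by a pset (octave information is discarded)."""
    return frozenset(pc(p) for p in pitches)

def complement(a: frozenset) -> frozenset:
    """A' = all elements of U not in A."""
    return AGGREGATE - a

# Psets whose pitches differ by octaves realize the same pcset (definition (4)).
pset1 = {60, 64, 67}   # C4, E4, G4
pset2 = {48, 76, 79}   # C3, E5, G5
a = pcset(pset1)
assert a == pcset(pset2) == frozenset({0, 4, 7})
assert a | complement(a) == AGGREGATE       # union with the complement gives U
assert a & complement(a) == frozenset()     # A and A' are disjoint, as in (12)
```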


In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction
or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice
is determined even further. As I compose, for instance, I ask: Will the next note
I write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved?
(And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not
arbitrary, for their own sake; they enable the listener to parse the ongoing
musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth. Many composers and analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts
such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to
do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of
(un-interpreted) pc segments, often called lynes, are realized
in pitch, time, and other musical dimensions, using some means of musical
articulation to maintain an association between the components of a given lyne.3
For instance, a lyne might be associated with a register, an instrument, a dynamic
level, a mode of articulation, or any combination of these, thereby separating it
out from …
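
The leap rule quoted at the start of this passage is mechanical enough to check in
code. Below is a loose Python sketch under stated assumptions: the step sizes, the
allowed leap sizes, and the test for a "permissible three-note sonority" are
placeholders of mine, not a faithful statement of modal counterpoint practice.

```python
STEP_SIZES = {1, 2}                    # semitone counts treated as steps (assumed)
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}    # m3, M3, P4, P5, m6, octave (assumed)

def outlines_permissible_sonority(a: int, b: int, c: int) -> bool:
    # Placeholder: treat two stacked thirds/fourths as a permissible outline.
    return abs(b - a) in {3, 4, 5} and abs(c - b) in {3, 4, 5}

def follows_leap_rule(melody: list[int]) -> bool:
    """Each leap must be followed by a contrary step, or by a second leap that
    outlines a permissible three-note sonority. (A final unresolved leap passes.)"""
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        leap = abs(b - a)
        if leap in STEP_SIZES:
            continue                               # not a leap
        if leap not in ALLOWED_LEAPS:
            return False                           # forbidden leap size
        contrary_step = abs(c - b) in STEP_SIZES and (b - a) * (c - b) < 0
        second_leap = (abs(c - b) in ALLOWED_LEAPS
                       and outlines_permissible_sonority(a, b, c))
        if not (contrary_step or second_leap):
            return False
    return True

print(follows_leap_rule([60, 65, 64, 62]))  # leap up a fourth, contrary step: True
print(follows_leap_rule([60, 67, 69]))      # leap up a fifth, step onward:   False
```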

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval
vector." It has thus often appeared to the uninitiated as the sterile application
of arcane, mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among
musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost
entirely in this unfamiliar language. Where did this theory come from and how has
it managed to become so dominant? Set theory emerged in response to the motivic
and contextual nature of post-tonal music. Tonal music uses only a small number
of referential sonorities (triads and seventh chords); post-tonal music presents
an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of
progression. In tonal music, motivic relationships are constrained by the norms
of tonal syntax; in post-tonal music, motives become independent and function as
primary structural determinants. In this situation, a new music theory was
needed, free of traditional …

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials
to frequencies in a stimulus.
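
Since the excerpt says pitches are "quantified as frequencies ... by comparing
sounds with pure tones," here is a minimal sketch of that quantification. The
12-tone equal temperament, the A4 = 440 Hz reference, and MIDI numbering are
assumptions of mine; the excerpt specifies none of them.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_midi(f_hz: float) -> int:
    """Nearest equal-tempered pitch to a frequency (A4 = 440 Hz = MIDI 69)."""
    return round(69 + 12 * math.log2(f_hz / 440.0))

def midi_to_name(m: int) -> str:
    return f"{NOTE_NAMES[m % 12]}{m // 12 - 1}"

for f in (261.63, 440.0, 466.16):
    print(f"{f:7.2f} Hz -> MIDI {hz_to_midi(f)} ({midi_to_name(hz_to_midi(f))})")
```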

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and
other more complicated relationships.
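
In symbols, "the KL distance between the joint density and the product of the
individual densities" is the standard definition of mutual information (a general
fact, not specific to this source):

\[
I(X;Y) \;=\; D_{\mathrm{KL}}\big(\,p(x,y)\,\|\,p(x)\,p(y)\,\big)
\;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;\ge\; 0,
\]

with equality exactly when p(x,y) = p(x)p(y), i.e., when X and Y are independent.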

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity
of the dynamics of the oscillators through the computation of a statistical
similarity measure (SSM). In this work we used three SSMs, namely the absolute
value of the cross correlation (also known as Pearson's coefficient) CC, the
mutual information MI, and the mutual information of the time series ordinal
patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
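
A minimal sketch (not the paper's code) of the three SSMs named above, applied to
two noisy, phase-shifted sine waves. The window length d = 3 for the ordinal
patterns and the histogram bin counts are illustrative assumptions.

```python
import numpy as np
from itertools import permutations
from math import factorial

def cc(x, y):
    """Absolute value of the cross correlation (Pearson's coefficient)."""
    return abs(np.corrcoef(x, y)[0, 1])

def mi(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    m = pxy > 0
    return float(np.sum(pxy[m] * np.log(pxy[m] / (px @ py)[m])))

def ordinal_patterns(x, d=3):
    """Index of the rank-order (permutation) pattern of each length-d window."""
    index = {p: i for i, p in enumerate(permutations(range(d)))}
    windows = np.lib.stride_tricks.sliding_window_view(x, d)
    return np.array([index[tuple(np.argsort(w))] for w in windows])

def miop(x, y, d=3):
    """Mutual information of the two series' ordinal-pattern sequences."""
    return mi(ordinal_patterns(x, d), ordinal_patterns(y, d), bins=factorial(d))

rng = np.random.default_rng(1)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * rng.normal(size=t.size)
y = np.sin(t + 0.3) + 0.1 * rng.normal(size=t.size)
print(f"CC = {cc(x, y):.2f}, MI = {mi(x, y):.2f}, MIOP = {miop(x, y):.2f}")
```

Note that CC is bounded by 1 while the MI estimates are in nats, so the three
values are not directly comparable without normalization.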

Page 5
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 6
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 7
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 8
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 9
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 10
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 11
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 12
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
Page 13
junk_scribd.txt
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
Page 14
junk_scribd.txt
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 15
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 16
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 17
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 18
junk_scribd.txt
https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 19
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 20
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from

In its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the frustration
has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language.
Where did this theory come from and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music.
Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical
configurations. Tonal music shares a common practice of harmony and voice leading;
post-tonal music is more highly self-referential: each work defines anew its basic
shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new
music theory was needed, free of traditional
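
As an aside for readers outside the secret society: the "interval vector" mentioned
above is easy to compute. It counts, for a pitch-class set, how many unordered pairs
lie each of the six interval classes apart. A minimal sketch (the example set
[0, 1, 2, 5, 6, 9] is the prime form usually catalogued as Forte 6-Z44):

```python
from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs of pcs, mod 12.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class: 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))     # [3, 1, 3, 4, 3, 1]
```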

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
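
To make "quantified as frequencies ... by comparing sounds with pure tones" concrete,
here is a sketch that assigns a frequency to a periodic waveform via its
autocorrelation peak (a stand-in for the matching procedure, not a description of it;
the 220 Hz test tone, sample rate, and search range are arbitrary assumptions):

```python
import numpy as np

sr = 44_100                      # sample rate in Hz (assumed)
t = np.arange(4096) / sr
# A complex periodic tone: 220 Hz fundamental plus two harmonics.
x = (np.sin(2 * np.pi * 220 * t)
     + 0.5 * np.sin(2 * np.pi * 440 * t)
     + 0.25 * np.sin(2 * np.pi * 660 * t))

def estimate_f0(sig, sr, fmin=50.0, fmax=2000.0):
    # Pick the autocorrelation peak inside the plausible pitch-period range.
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

print(estimate_f0(x, sr))        # ~220 Hz
```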
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 25
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 26
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 27
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
Page 28
junk_scribd.txt
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 29
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 30
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 31
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
Page 32
junk_scribd.txt
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Page 33
junk_scribd.txt
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 34
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 35
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 36
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 37
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


Page 38
junk_scribd.txt
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\
Page 39
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 40
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
Page 41
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
Page 42
junk_scribd.txt
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

In its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
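Since the passage turns on terms like "interval vector," a small illustration may
help. The following is a minimal sketch, not drawn from the quoted text, of
computing the interval-class vector of a pcset; the function name ic_vector is my
own label.

from itertools import combinations

def ic_vector(pcset):
    """Interval-class vector: counts of interval classes 1-6 in a pcset."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        interval = (b - a) % 12
        ic = min(interval, 12 - interval)  # fold intervals 7-11 onto classes 5-1
        vec[ic - 1] += 1
    return vec

# Example: 6-Z44 in prime form [0,1,2,5,6,9] -> [3,1,3,4,3,1]
print(ic_vector([0, 1, 2, 5, 6, 9]))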

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
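As a worked illustration of quantifying pitch against a reference tone, here is a
minimal sketch (not from the article) that maps a frequency in hertz to the nearest
equal-tempered MIDI note, assuming the A440 standard; freq_to_midi is a hypothetical
helper name.

import math

def freq_to_midi(freq_hz, ref_hz=440.0):
    """Nearest equal-tempered MIDI note for a frequency (A4 = 440 Hz = note 69)."""
    return round(69 + 12 * math.log2(freq_hz / ref_hz))

print(freq_to_midi(261.63))  # middle C -> 60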
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the Kullback-Leibler (KL) divergence between the joint
density and the product of the individual densities. So MI can measure
non-monotonic relationships and other more complicated relationships.
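A minimal sketch of that definition, assuming only NumPy: mutual information
computed directly as the KL divergence between a small discrete joint distribution
and the product of its marginals. The 2x2 table is invented for illustration.

import numpy as np

# Invented 2x2 joint distribution p(x, y); rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of X
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of Y

# MI = KL( p(x,y) || p(x) p(y) ), in nats.
mi = np.sum(p_xy * np.log(p_xy / (p_x * p_y)))
print(f"MI = {mi:.4f} nats")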

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearson's coefficient), CC; the mutual information,
MI; and the mutual information of the time-series ordinal patterns, MIOP [25]. The
former is a linear measure and the latter two are non-linear ones.
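To make "ordinal patterns" concrete: each length-m window of a time series is
reduced to the permutation that sorts it, and MIOP then applies mutual information
to those symbol sequences. Here is a minimal sketch of the pattern-extraction step,
under the assumption of embedding dimension m = 3 and no ties; ordinal_patterns is
my own name, not taken from the paper.

import numpy as np

def ordinal_patterns(series, m=3):
    """Map each length-m window to the permutation that sorts it."""
    series = np.asarray(series)
    return [tuple(np.argsort(series[i:i + m])) for i in range(len(series) - m + 1)]

# Example: windows of [4.0, 1.0, 3.0, 2.0] -> patterns (1, 2, 0) and (0, 2, 1)
print(ordinal_patterns([4.0, 1.0, 3.0, 2.0]))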

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 45
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
Page 46
junk_scribd.txt
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 47
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 48
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 49
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 50
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 51
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
Page 52
junk_scribd.txt
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
Page 53
junk_scribd.txt
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 54
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
Page 55
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
Page 56
junk_scribd.txt
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 57
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 58
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 59
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 60
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 61
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
Page 62
junk_scribd.txt
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
Page 63
junk_scribd.txt
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain task than the estimation of Covariance.
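A worked instance of the "zero covariance yet dependent" situation described above
(a standard textbook construction, not drawn from the thread itself): let X be
uniform on {-1, 0, 1} and Y = X^2. Then

\[
\operatorname{Cov}(X,Y) = E[XY] - E[X]\,E[Y] = E[X^{3}] - 0 \cdot E[Y] = 0,
\]

yet Y is completely determined by X, so the two are dependent, and

\[
I(X;Y) = H(Y) - H(Y \mid X) = H(Y)
       = \tfrac{1}{3}\log_{2}3 + \tfrac{2}{3}\log_{2}\tfrac{3}{2}
       \approx 0.918 \text{ bits},
\]

since H(Y | X) = 0 (Y is a function of X) and P(Y=1) = 2/3, P(Y=0) = 1/3.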
In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice
is determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric
into meaningful units. They help me to determine "by ear" whether the next note is
in the same voice, or jumps to another in an arpeggiation, or is ornamental or not,
and so forth.

Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and
Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have however called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
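The cantus firmus rule quoted at the start of this passage is concrete enough to
check mechanically. Below is a toy sketch of such a check; the set of permissible
leap sizes and the two-semitone step threshold are my own illustrative assumptions
(the passage names neither), and the three-note-sonority proviso is only noted, not
enforced:

# Toy check of the quoted rule: a leap must be followed either by a step in the
# opposite direction or by another leap. Intervals are in semitones; the
# allowed-leap set below is an illustrative assumption, not from the source.
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}  # m3, M3, P4, P5, m6, octave (assumed)

def is_step(interval: int) -> bool:
    return abs(interval) in (1, 2)

def leap_rule_ok(melody: list[int]) -> bool:
    """Return True if every leap in the melody obeys the follow-up rule."""
    intervals = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(intervals, intervals[1:]):
        if abs(prev) > 2:                        # prev is a leap
            if abs(prev) not in ALLOWED_LEAPS:
                return False                     # forbidden leap size
            step_back = is_step(nxt) and prev * nxt < 0
            another_leap = abs(nxt) > 2          # sonority check omitted here
            if not (step_back or another_leap):
                return False
    return True

print(leap_rule_ok([60, 67, 65, 64]))  # P5 up, then step down: True
print(leap_rule_ok([60, 67, 69, 69]))  # leap, then step in same direction: False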

Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U;
the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the
complement of A by A′. NB: A ∪ A′ = U (A and A′ are disjoint).
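These definitions map directly onto code. A minimal sketch using Python's built-in
frozenset, taking U = {0, ..., 11}; the variable and function names are mine, not
the primer's:

# Pitch-class set basics over the aggregate U = {0, ..., 11}.
AGGREGATE = frozenset(range(12))      # U
EMPTY = frozenset()                   # the null set

def pc(pitch: int) -> int:
    """Map a pitch (integer semitone) to its pitch class, mod 12."""
    return pitch % 12

def complement(a: frozenset) -> frozenset:
    """All elements of U not in A."""
    return AGGREGATE - a

c_major = frozenset({0, 4, 7})
f_sharp = frozenset({6, 10, 1})

print(c_major | f_sharp)                            # union
print(c_major & f_sharp)                            # intersection: empty, disjoint
print(complement(c_major))                          # the other nine pcs
print(c_major | complement(c_major) == AGGREGATE)   # True: A ∪ A′ = U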

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
Page 82
junk_scribd.txt
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-
versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
Page 83
junk_scribd.txt
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Page 84
junk_scribd.txt
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 85
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice
is determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric
into meaningful units. They help me to determine "by ear" whether the next note is
in the same voice, or jumps to another in an arpeggiation, or is ornamental or not,
and so forth. Many composers and analysts have sought some extension or
generalization of tonal voice-leading for non-tonal music. Analysts such as Felix
Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such
as Schenkerian prolongation to music that appears to have little to do with tonality
or even pitch concentricity.1 Joseph N. Straus and others have, however, called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from
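
The leap rule described above is mechanical enough to check by machine. A toy
sketch, with loudly hypothetical parameters: the set of permissible leap sizes (in
semitones) and the step sizes below are my own illustrative choices, not a citation
of any counterpoint treatise, and the three-note-sonority clause is omitted.

# Hypothetical cantus firmus leap-rule checker (illustrative only).
STEP = {1, 2}                      # semitone, whole tone
LEAPS = {3, 4, 5, 7, 8, 12}        # assumed permissible leap sizes

def check_leaps(pitches):
    """Flag any leap not followed by a step in the opposite direction
    or by another (permissible) leap. Pitches are MIDI numbers."""
    problems = []
    for i in range(len(pitches) - 2):
        move = pitches[i + 1] - pitches[i]
        nxt = pitches[i + 2] - pitches[i + 1]
        if abs(move) in LEAPS:
            step_back = abs(nxt) in STEP and (move * nxt < 0)
            another_leap = abs(nxt) in LEAPS
            if not (step_back or another_leap):
                problems.append(i + 1)
    return problems

# D4 E4 G4 F4 E4 D4: the G4->F4 step back "resolves" the E4->G4 leap.
print(check_leaps([62, 64, 67, 65, 64, 62]))   # -> []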

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
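
To make "relative positions on a musical scale based on frequency" concrete, here
is a small conversion from frequency to the nearest equal-tempered note, using the
common A4 = 440 Hz reference (the reference pitch itself is a convention, not part
of the quoted definition):

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz, a4=440.0):
    """Return the nearest 12-TET note name and the error in cents."""
    semitones = 12 * math.log2(freq_hz / a4)      # distance from A4
    n = round(semitones)
    cents = 100 * (semitones - n)
    midi = 69 + n                                 # MIDI number of A4 is 69
    return NAMES[midi % 12] + str(midi // 12 - 1), cents

print(nearest_note(440.0))    # ('A4', 0.0)
print(nearest_note(261.63))   # ('C4', ~0 cents)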

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch is closely related to frequency, but the two are not equivalent. Frequency is an objective, scientific attribute that can be measured. Pitch is each person's subjective perception of a sound wave, which cannot be directly measured. However, this does not necessarily mean that most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of the auditory system work together to yield the experience of pitch. In general, pitch perception theories can be divided into place coding and temporal coding. Place theory holds that the perception of pitch is determined by the place of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for the perception of high frequencies, since neurons have an upper limit on how fast they can phase-lock their action potentials.[6] However, a purely place-based theory cannot account for the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
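Since pitches are quantified as frequencies by comparison with pure tones, it is worth recording the standard equal-tempered conversion (a textbook formula, not part of the quoted article): MIDI note 69 is A4 = 440 Hz, and each semitone scales frequency by 2^(1/12).

import math

def midi_to_hz(n, a4=440.0):
    # equal temperament: MIDI 69 = A4; each semitone is a factor of 2**(1/12)
    return a4 * 2.0 ** ((n - 69) / 12.0)

def hz_to_midi(f, a4=440.0):
    # inverse mapping, rounded to the nearest equal-tempered note
    return round(69 + 12 * math.log2(f / a4))

print(midi_to_hz(60))      # middle C: ~261.63 Hz
print(hz_to_midi(329.63))  # ~E4 -> MIDI 64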

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...

From its earliest days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
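As a concrete anchor for that vocabulary, here is a short sketch (mine, not the author's) of the interval-class vector that a label like "6-Z44" summarizes. Pitch classes are integers mod 12, and interval classes run from 1 to 6.

from itertools import combinations

def interval_class_vector(pcset):
    # count, for each unordered pair of pitch classes, its interval class (1..6)
    icv = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = min((a - b) % 12, (b - a) % 12)
        icv[ic - 1] += 1
    return icv

# the hexachord 6-Z44 in its prime form
print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # -> [3, 1, 3, 4, 3, 1]

Z-related sets such as 6-Z44 and its partner 6-Z19 share this same vector without being related by transposition or inversion, which is why the vector alone cannot identify a set class.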

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
Page 113
junk_scribd.txt
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 114
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 115
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 116
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 117
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

Page 118
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
Page 119
junk_scribd.txt
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 120
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 121
junk_scribd.txt

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
Page 122
junk_scribd.txt
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Page 123
junk_scribd.txt
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 124
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 125
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 126
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 127
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.


In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a
few permissible three-note sonorities. In multi-voice contexts, the leading of a
voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the
dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the
note is in the bass, soprano, or an inner voice.) But these voice-leading rules are
not arbitrary, for their own sake; they enable the listener to parse the ongoing
musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth. Many composers and analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts
such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to
do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of
(un-interpreted) pc segments, often called lynes, are realized in pitch, time, and
other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of articulation,
or any combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships
are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
Forte is well known for his book The Structure of Atonal Music (1973), which traces
many of its roots to an article of a decade earlier: "A Theory of Set-Complexes for
Music" (1964).[6] In these works, he "applied set-theoretic principles to the
analysis of unordered collections of pitch classes, called pitch-class sets (pc
sets). [...] The basic goal of Forte's theory was to define the various
relationships that existed among the relevant sets of a work, so that contextual
coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it
(to varying degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian
analysis and music of the Great American Songbook. A complete, annotated
bibliography of his publications appears in the previously cited article, Berry,
"The Twin Legacies of a Scholar-Teacher." Excluding items only edited by Forte, it
lists ten books, sixty-three articles, and thirty-six other types of publications,
from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period
in its development, from volume 4/2 (1960) through 11/1 (1967). His involvement with
the journal, including many biographical details, is addressed in David Carson
Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in
commemoration of his seventieth birthday, was published in 1997 and edited by his
former students James M. Baker, David W. Beach, and Jonathan W. Bernard (FA12, FA6,
and FA11, according to Berry's list). It was titled Music Theory in Concept and
Practice (a title derived from Forte's 1962 undergraduate textbook, Tonal Harmony in
Concept and Practice). The second was serialized in five installments of Gamut: The
Journal of the Music Theory Society of the Mid-Atlantic, between 2009 and 2013. It
was edited by Forte's former student David Carson Berry (FA72) and was titled A
Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by
Forte's former doctoral advisees, and three special features: a previously
unpublished article by Forte, on Gershwin songs; a collection of tributes and
reminiscences from forty-two of his former advisees; and an annotated register of
his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita
professor of piano at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia
Univ. Teachers College.

Page 133
junk_scribd.txt
arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted
nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
Page 134
junk_scribd.txt
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 135
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 136
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 137
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 138
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 139
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 140
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 141
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 142
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
Page 143
junk_scribd.txt
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

Page 144
junk_scribd.txt
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 145
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 146
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 147
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Page 148
junk_scribd.txt
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.


In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

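The leap rule described above is mechanical enough to check by machine, which is one way to see that such rules are well defined. A minimal sketch (my own illustration, not from the text; pitches are MIDI numbers, and the "second leap outlining a permissible sonority" escape clause is omitted):

def leap_rule_violations(melody):
    """Indices where a leap (> 2 semitones) is not followed by a step
    (<= 2 semitones) in the opposite direction."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:  # a leap
            answered = abs(second) <= 2 and second * first < 0
            if not answered:
                bad.append(i + 1)
    return bad

# D-dorian fragment: D4 E4 F4 D4 B4 A4 G4 F4 E4 D4.
print(leap_rule_violations([62, 64, 65, 62, 71, 69, 67, 65, 64, 62]))
# -> [3]: the leap F4->D4 is answered by another leap (D4->B4), not a step;
#    the large leap D4->B4 itself passes, since B4->A4 steps back down.
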
In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
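
Since "interval vector" is singled out as part of that vocabulary, it may help to see how little machinery it involves. A minimal sketch of the standard interval-class vector (the function name is mine):

from itertools import combinations

def interval_vector(pcset):
    """Counts of interval classes 1..6 over all unordered pairs of pcs."""
    vec = [0] * 6
    for a, b in combinations(sorted(pcset), 2):
        d = (b - a) % 12
        ic = min(d, 12 - d)  # interval class
        vec[ic - 1] += 1
    return vec

# The hexachord {0,1,2,5,6,9} is Forte's 6-Z44; its vector is <313431>.
print(interval_vector({0, 1, 2, 5, 6, 9}))  # [3, 1, 3, 4, 3, 1]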

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
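
The quantification of pitch by frequency can be made concrete with the twelve-tone equal-temperament convention (a minimal sketch; the A4 = 440 Hz reference and MIDI numbering are conventions, and the helper name is mine):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz, ref_a4=440.0):
    """Nearest equal-tempered pitch: MIDI note 69 is A4 at `ref_a4` Hz,
    and each semitone is a factor of 2**(1/12) in frequency."""
    midi = round(69 + 12 * math.log2(freq_hz / ref_a4))
    octave = midi // 12 - 1  # MIDI 60 -> C4
    return f"{NOTE_NAMES[midi % 12]}{octave}"

print(frequency_to_pitch(261.6))  # middle C -> "C4"
print(frequency_to_pitch(466.2))  # -> "A#4"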
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
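
Loosely speaking, the computational counterpart of such temporal theories is periodicity detection: the perceived pitch corresponds to the lag at which the signal best matches a delayed copy of itself. A minimal autocorrelation sketch of that idea (my own illustration, not a model of auditory neurons):

import numpy as np

def estimate_f0(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental as the lag, within a plausible pitch
    range, that maximizes the signal's autocorrelation."""
    signal = signal - signal.mean()
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    best_lag = lo + int(np.argmax(ac[lo:hi]))
    return sample_rate / best_lag

sr = 16_000
t = np.arange(4000) / sr  # a quarter-second of signal
# A complex tone: 220 Hz fundamental plus two harmonics.
tone = (np.sin(2 * np.pi * 220 * t)
        + 0.5 * np.sin(2 * np.pi * 440 * t)
        + 0.3 * np.sin(2 * np.pi * 660 * t))
print(round(estimate_f0(tone, sr), 1))  # close to 220 (one-sample lag grid)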

Forte is well known for his book The Structure of Atonal Music (1973), which traces many of its roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964).[6] In these works, he "applied set-theoretic principles to the analysis of unordered collections of pitch classes, called pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to define the various relationships that existed among the relevant sets of a work, so that contextual coherence could be demonstrated." Although the methodology derived from Forte's work "has had its detractors ... textbooks on post-tonal analysis now routinely teach it (to varying degrees)."[7]
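
One of the simplest relationships of this kind is transpositional equivalence, which is easy to state computationally (a minimal sketch; the names are mine, and this is only one of the relations the theory treats):

def transpositions(pcset):
    """All twelve transpositions T_n of a pitch-class set."""
    return [frozenset((p + n) % 12 for p in pcset) for n in range(12)]

def transpositionally_equivalent(a, b):
    """True if pc set `b` is some transposition T_n of pc set `a`."""
    return frozenset(b) in transpositions(a)

c_major = {0, 4, 7}  # C E G
a_major = {9, 1, 4}  # A C# E
print(transpositionally_equivalent(c_major, a_major))  # True (T_9)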

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis and music of the Great American Songbook. A complete, annotated bibliography of his publications appears in the previously cited article, Berry, "The Twin Legacies of a Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three articles, and thirty-six other types of publications, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period
in its development, from volume 4/2 (1960) through 11/1 (1967). His involvement with
the journal, including many biographical details, is addressed in David Carson
Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in
commemoration of his seventieth birthday, was published in 1997 and edited by his
former students James M. Baker, David W. Beach, and Jonathan W. Bernard (FA12, FA6,
and FA11, according to Berry's list). It was titled Music Theory in Concept and
Practice (a title derived from Forte's 1962 undergraduate textbook, Tonal Harmony in
Concept and Practice). The second was serialized in five installments of Gamut: The
Journal of the Music Theory Society of the Mid-Atlantic, between 2009 and 2013. It
was edited by Forte's former student David Carson Berry (FA72) and was titled A
Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by
Forte's former doctoral advisees, and three special features: a previously
unpublished article by Forte, on Gershwin songs; a collection of tributes and
reminiscences from forty-two of his former advisees; and an annotated register of
his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita
professor of piano at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia
Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt,
Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language.
Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W.
W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton:
Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
1. "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.
Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∪ A′ = U (A and A′ are disjoint).
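
These definitions translate directly into code. A minimal sketch (my own illustration; Python's set type stands in for pcsets, and U is the aggregate 0..11):

AGGREGATE = frozenset(range(12))  # U: all twelve pitch classes

def pcs(pitches):
    """Realize a pcset from pitches (integers): take each pitch mod 12."""
    return frozenset(p % 12 for p in pitches)

A = pcs([60, 64, 67])  # C4, E4, G4 -> {0, 4, 7}
B = pcs([67, 71, 74])  # G4, B4, D5 -> {2, 7, 11}

print(A | B)                    # union, A ∪ B
print(A & B)                    # intersection, A ∩ B -> {7}
print(A.isdisjoint(B))          # False: they share pc 7
print(AGGREGATE - A)            # complement A′: all pcs of U not in A
print((A | (AGGREGATE - A)) == AGGREGATE)  # NB: A ∪ A′ = U -> True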

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-
versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 165
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
Page 166
junk_scribd.txt
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
Page 167
junk_scribd.txt
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 168
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 169
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited
to certain intervals and must be followed either by a step in the opposite direction
or by another leap, provided the two successive leaps outline one of a few permissible
three-note sonorities. In multi-voice contexts, the leading of a voice is determined
even further. As I compose, for instance, I ask: Will the next note I write down form
a consonance with the other voices? If not, is the dissonance correctly prepared and
resolved? What scale degrees and harmonies are involved? (And the answers to such
questions will of course depend on whether the note is in the bass, soprano, or an
inner voice.) But these voice-leading rules are not arbitrary, for their own sake;
they enable the listener to parse the ongoing musical fabric into meaningful units.
They help me to determine "by ear" whether the next note is in the same voice, or
jumps to another in an arpeggiation, or is ornamental, and so forth. Many composers
and analysts have sought some extension or generalization of tonal voice-leading for
non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N.
Straus and others have, however, called such work into question.2 Other theorists have
obviated voice-leading as a criterion for distinguishing linear aspects of pitch
structure. For example, in my own theory of compositional design, ensembles of
(un-interpreted) pc segments, often called lynes, are realized in pitch, time, and
other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from ...
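
The leap rule above lends itself to a mechanical check. Below is a minimal sketch
under simplifying assumptions: pitches are MIDI numbers, a "leap" is any interval of
three or more semitones, and the proviso about permissible three-note sonorities is
omitted; the cantus fragment is invented for illustration.

def leap_violations(melody, leap=3, max_step=2):
    """Flag positions where a leap (>= `leap` semitones) is not followed by
    a step (<= `max_step` semitones) in the opposite direction or by another
    leap. (The source rule also requires two successive leaps to outline a
    permissible three-note sonority; that check is omitted here.)"""
    issues = []
    moves = [b - a for a, b in zip(melody, melody[1:])]
    for i in range(len(moves) - 1):
        cur, nxt = moves[i], moves[i + 1]
        if abs(cur) >= leap:
            step_back = abs(nxt) <= max_step and cur * nxt < 0
            another_leap = abs(nxt) >= leap
            if not (step_back or another_leap):
                issues.append(i + 1)   # index of the note after the leap
    return issues

cantus = [62, 65, 64, 62, 69, 67, 65, 64, 62]  # invented D-dorian fragment
print(leap_violations(cantus))                  # [] -> every leap recovers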

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus
often appeared to the uninitiated as the sterile application of arcane, mathematical
concepts to inaudible and uninteresting musical relationships. This situation has
created understandable frustration among musicians, and the frustration has grown as
discussions of twentieth-century music in the professional theoretical literature have
come to be expressed almost entirely in this unfamiliar language. Where did this
theory come from, and how has it managed to become so dominant? Set theory emerged in
response to the motivic and contextual nature of post-tonal music. Tonal music uses
only a small number of referential sonorities (triads and seventh chords); post-tonal
music presents an extraordinary variety of musical configurations. Tonal music shares
a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In
tonal music, motivic relationships are constrained by the norms of tonal syntax; in
post-tonal music, motives become independent and function as primary structural
determinants. In this situation, a new music theory was needed, free of traditional ...
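
One of those forbidding terms is easy to make concrete. The interval-class vector
counts, for each interval class 1 through 6, how many unordered pairs in a pitch-class
set lie that many semitones apart (taken mod 12 and folded, so complementary intervals
coincide); a short sketch:

from itertools import combinations

def interval_vector(pcset):
    """Interval-class vector of a pitch-class set: counts of interval
    classes 1..6 over all unordered pairs, with each interval taken
    mod 12 and folded (ic = min(d, 12 - d))."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        d = (b - a) % 12
        vec[min(d, 12 - d) - 1] += 1
    return vec

print(interval_vector([0, 4, 7]))           # major triad -> [0, 0, 1, 1, 1, 0]
print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]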

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
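
That quantification step is easy to make explicit. A minimal sketch, assuming
twelve-tone equal temperament with A4 = 440 Hz (standard conventions, not claims made
in the text):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz, a4_hz=440.0):
    """Quantize a frequency to the nearest equal-tempered pitch; return
    (note name, octave, deviation in cents from that pitch)."""
    midi = 69 + 12 * math.log2(freq_hz / a4_hz)   # MIDI 69 = A4
    nearest = round(midi)
    cents = 100.0 * (midi - nearest)
    return NOTE_NAMES[nearest % 12], nearest // 12 - 1, cents

print(frequency_to_pitch(440.0))   # ('A', 4, 0.0)
print(frequency_to_pitch(452.0))   # ('A', 4, ~+46.6) -- sharp by ~47 cents
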
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 194
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...
In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the frustration
has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language.
Where did this theory come from, and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music.
Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical
configurations. Tonal music shares a common practice of harmony and voice leading;
post-tonal music is more highly self-referential: each work defines anew its basic
shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new
music theory was needed, free of traditional ...
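
For readers put off by vocabulary like "interval vector", the computation behind the
term is small. The Python sketch below counts interval classes over all pairs of
pitch classes; applied to the prime form of Forte's 6-Z44, (0,1,2,5,6,9), it returns
the vector [3,1,3,4,3,1].

from itertools import combinations

def interval_vector(pcset):
    """Interval-class vector: counts of interval classes 1..6 over all pc pairs."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = (b - a) % 12
        ic = min(ic, 12 - ic)     # fold intervals 7..11 into classes 5..1
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1] for 6-Z44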

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
Page 196
junk_scribd.txt
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 197
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 198
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 199
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 200
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 201
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 202
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 203
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 204
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
Page 205
junk_scribd.txt
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Page 206
junk_scribd.txt
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 207
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 208
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 209
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 210
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 211
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 212
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a
few permissible three-note sonorities. In multi-voice contexts, the leading of a
voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the
dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the
note is in the bass, soprano, or an inner voice.) But these voice-leading rules are
not arbitrary, for their own sake; they enable the listener to parse the ongoing
musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth. Many composers and analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts
such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to
do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of
(uninterpreted) pc segments, often called lynes, are realized in pitch, time, and
other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of
articulation, or any combination of these, thereby separating it out from
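The cantus firmus leap rule quoted above is mechanical enough to state as code. A minimal sketch (my own illustration, not from the source; pitches are MIDI note numbers, and the further condition on permissible three-note outlines is omitted):

    def leap_resolutions_ok(melody):
        # A "leap" is any interval larger than a major second (2 semitones).
        # After a leap, require either a step in the opposite direction or
        # another leap.
        for a, b, c in zip(melody, melody[1:], melody[2:]):
            first, second = b - a, c - b
            if abs(first) > 2:
                steps_back = 0 < abs(second) <= 2 and second * first < 0
                leaps_on = abs(second) > 2
                if not (steps_back or leaps_on):
                    return False
        return True

    print(leap_resolutions_ok([60, 65, 64, 62, 60]))  # leap up, step back: True
    print(leap_resolutions_ok([60, 65, 67, 69]))      # leap up, step on up: False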

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval
vector." It has thus often appeared to the uninitiated as the sterile application
of arcane, mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among
musicians, and the frustration has grown as discussions of twentieth-century music
in the professional theoretical literature have come to be expressed almost
entirely in this unfamiliar language. Where did this theory come from and how has
it managed to become so dominant? Set theory emerged in response to the motivic
and contextual nature of post-tonal music. Tonal music uses only a small number of
referential sonorities (triads and seventh chords); post-tonal music presents an
extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of
progression. In tonal music, motivic relationships are constrained by the norms of
tonal syntax; in post-tonal music, motives become independent and function as
primary structural determinants. In this situation, a new music theory was needed,
free of traditional
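For readers put off by those passwords: an interval vector simply counts, for a pitch-class set, how many unordered pairs lie each of the six interval classes apart. A minimal sketch (6-Z44 and 6-Z19 are standard Forte names; the prime forms below are the usual ones, quoted from memory):

    from itertools import combinations

    def interval_vector(pcs):
        # Count interval classes 1..6 over all unordered pairs of pitch classes.
        vec = [0] * 6
        for a, b in combinations(pcs, 2):
            ic = min((a - b) % 12, (b - a) % 12)  # interval class, 1..6
            vec[ic - 1] += 1
        return vec

    # Z-related hexachords: distinct set classes sharing one interval vector.
    print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]
    print(interval_vector([0, 1, 3, 4, 7, 8]))  # 6-Z19 -> [3, 1, 3, 4, 3, 1]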

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
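That comparison against pure tones is what the usual equal-tempered quantification automates: a frequency f maps to a note number via 69 + 12*log2(f / 440). A minimal sketch (the A4 = 440 Hz reference and MIDI-style numbering are conventional defaults, not something the quoted text mandates):

    import math

    NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def hz_to_pitch(freq_hz):
        # MIDI-style note number: 69 is A4 = 440 Hz, 12 semitones per octave.
        midi = 69 + 12 * math.log2(freq_hz / 440.0)
        n = round(midi)
        cents = 100 * (midi - n)  # deviation from the nearest tempered pitch
        return f"{NAMES[n % 12]}{n // 12 - 1} {cents:+.0f} cents"

    print(hz_to_pitch(440.0))   # A4 +0 cents
    print(hz_to_pitch(261.63))  # C4 (middle C), within a cent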
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 217
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 218
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 219
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
Page 220
junk_scribd.txt
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 221
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 222
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 223
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 224
junk_scribd.txt

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
Page 225
junk_scribd.txt
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


Page 226
junk_scribd.txt
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\
Page 227
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 228
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
Page 229
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
Page 230
junk_scribd.txt
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
Page 231
junk_scribd.txt
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 232
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 233
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...

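The leap rule paraphrased above is mechanical enough to check by machine. A toy
sketch, assuming MIDI-style semitone numbering and treating any move larger than two
semitones as a leap (both assumptions mine; the "permissible three-note sonorities"
proviso for consecutive leaps is not checked):

```python
def leap_violations(melody):
    """Indices where a leap is answered neither by a contrary step
    nor by another leap. Pitches are MIDI note numbers."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:                                   # a leap...
            contrary_step = 0 < abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2                   # proviso unchecked
            if not (contrary_step or another_leap):
                bad.append(i)
    return bad

print(leap_violations([60, 65, 64, 62, 60]))   # leap up, step down: []
print(leap_violations([60, 65, 67, 65, 64]))   # leap up, step up: [0]
```
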
In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
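
Of the two expressions cited, the "interval vector" at least is easy to make
concrete. A short sketch using the standard pc-set definition (counts of interval
classes 1-6 over all pairs of pcs; the definition is the conventional one, not taken
from this text):

```python
from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector of a pcset: entry k counts the pairs whose
    interval class (mod-12 distance, folded to 1..6) equals k+1."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)
        vec[ic - 1] += 1
    return vec

print(interval_vector({0, 4, 7}))   # major triad -> [0, 0, 1, 1, 1, 0]
```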

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
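
As a concrete instance of quantifying pitch as frequency by comparison with pure
tones, the usual twelve-tone equal-temperament convention maps a frequency to a note
number and back (A4 = 440 Hz; these constants come from the convention, not from the
passage above):

```python
import math

def hz_to_midi(f_hz):
    """Nearest equal-tempered MIDI note number for a frequency."""
    return round(69 + 12 * math.log2(f_hz / 440.0))

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

print(hz_to_midi(261.63))   # 60, i.e. middle C
print(midi_to_hz(69))       # 440.0
```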
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U;
the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the
complement of A by A′. NB: A ∩ A′ = ∅ and A ∪ A′ = U (A and A′ are disjoint).

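These definitions map directly onto Python's built-in sets; a minimal sketch over
the mod-12 aggregate (the example pcsets are my own):

```python
U = frozenset(range(12))                  # the aggregate (definition 6)
EMPTY = frozenset()                       # the null set

def complement(a):
    """A' = all elements of U not in A (definition 12)."""
    return U - a

a = frozenset({0, 4, 7})                  # a pcset: C major triad as pcs
b = frozenset({0, 2, 4, 5, 7, 9, 11})     # C major scale as pcs

print(a <= b)                             # inclusion (8): True
print(a | b == b, a & b == a)             # union (9), intersection (10)
print(a & complement(a) == EMPTY)         # disjoint (11): True
print(a | complement(a) == U)             # the NB in (12): True
```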
nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-
versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 245
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
Page 246
junk_scribd.txt
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
Page 247
junk_scribd.txt

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 248
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 249
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 250
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
Page 251
junk_scribd.txt
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 252
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 253
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 254
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 255
junk_scribd.txt
In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have however called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...

... in its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
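
As a small illustration of quantifying pitch as frequency, here is a hedged sketch assuming the common equal-tempered scale with an A4 = 440 Hz reference; the helper name and the cents output are choices of this sketch, not from the quoted article.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz, a4=440.0):
    """Nearest equal-tempered note for a frequency, plus deviation in cents."""
    midi = 69 + 12 * math.log2(freq_hz / a4)   # fractional MIDI note number
    nearest = round(midi)
    cents = 100.0 * (midi - nearest)           # how far off the tempered grid
    return NOTE_NAMES[nearest % 12], nearest // 12 - 1, cents

print(frequency_to_note(261.63))  # ('C', 4, ~0): middle C
print(frequency_to_note(452.0))   # ('A', 4, ~+47): a sharp-leaning A4
```
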
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
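
A short numerical illustration of this point follows; it is a sketch under assumed toy data (a noisy sine), using SciPy's pearsonr and spearmanr and a histogram estimate of the KL form of MI quoted above.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 4.0 * np.pi, 20_000)
y = np.sin(x) + rng.normal(0.0, 0.1, x.size)   # strong but non-monotonic

# Plug-in estimate of I(X;Y) = KL( p(x,y) || p(x)p(y) ), the form quoted above.
pxy, _, _ = np.histogram2d(x, y, bins=32)
pxy /= pxy.sum()
pind = pxy.sum(1, keepdims=True) * pxy.sum(0, keepdims=True)
nz = pxy > 0
mi = float(np.sum(pxy[nz] * np.log(pxy[nz] / pind[nz])))

print(f"Pearson r    ~ {pearsonr(x, y)[0]:+.3f}")   # ~0: no linear trend
print(f"Spearman rho ~ {spearmanr(x, y)[0]:+.3f}")  # ~0: not monotonic either
print(f"MI           ~ {mi:.3f} nats")              # clearly positive
```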

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are non-linear ones.
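
The quoted passage names the three SSMs but gives no formulas; the sketch below shows one plausible implementation of each (the ordinal-pattern order, bin counts, and toy signals are assumptions of this sketch, and the paper's exact estimators may differ).

```python
import numpy as np

def ordinal_patterns(ts, order=3):
    """Encode each length-`order` window of a series as its permutation pattern."""
    windows = np.lib.stride_tricks.sliding_window_view(ts, order)
    ranks = np.argsort(np.argsort(windows, axis=1), axis=1)
    return ranks @ (order ** np.arange(order))   # unique integer per permutation

def mi_discrete(a, b):
    """Plug-in mutual information (nats) between two integer symbol sequences."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1.0)
    joint /= joint.sum()
    pind = joint.sum(1, keepdims=True) * joint.sum(0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / pind[nz])))

# Two noisy, phase-shifted oscillators standing in for a pair of network nodes.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 20.0, 4000)
xs = np.sin(t) + 0.2 * rng.normal(size=t.size)
ys = np.sin(t + 0.5) + 0.2 * rng.normal(size=t.size)

cc = abs(np.corrcoef(xs, ys)[0, 1])                          # |cross correlation|
dx = np.digitize(xs, np.histogram_bin_edges(xs, 16)) - 1     # discretize for MI
dy = np.digitize(ys, np.histogram_bin_edges(ys, 16)) - 1
mi = mi_discrete(dx, dy)                                     # mutual information
miop = mi_discrete(ordinal_patterns(xs), ordinal_patterns(ys))  # MIOP
print(f"CC ~ {cc:.3f}   MI ~ {mi:.3f}   MIOP ~ {miop:.3f}")
```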

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 258
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 259
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 260
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
Page 261
junk_scribd.txt
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 262
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 263
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 264
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
Page 265
junk_scribd.txt
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
Page 266
junk_scribd.txt
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 267
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 268
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 269
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 270
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 271
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 272
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 273
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 274
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
Page 275
junk_scribd.txt
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
Page 276
junk_scribd.txt
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 277
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" with whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain task than the estimation of Covariance.
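
That practical asymmetry can be made concrete: sample covariance is a single moment
formula, while a plug-in MI estimate depends on how the densities are discretized.
In the sketch below (NumPy; the bin counts are arbitrary choices of mine), covariance
needs no tuning at all, whereas the MI estimate shifts with the binning, which is
exactly the delicacy the answer describes.

import numpy as np

def sample_cov(x, y):
    # A moment expression: no density estimate required.
    return float(np.mean((x - x.mean()) * (y - y.mean())))

def hist_mi(x, y, bins):
    # Plug-in MI from a 2-D histogram: the result depends on `bins`.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = 0.5 * x + 0.5 * rng.standard_normal(2000)

print(sample_cov(x, y))              # one number, no free parameters
for b in (5, 20, 80):
    print(b, hist_mi(x, y, bins=b))  # estimate drifts with bin choice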

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. (A simplified rule-checker for the leap constraints is sketched after this
passage.) Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.[1] Joseph N. Straus and others have, however, called such work into
question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.[3] For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...
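
As flagged above, the quoted leap rules lend themselves to a mechanical check. The
sketch below encodes only the constraints the passage actually names (a leap must be
recovered by a step in the opposite direction or by a second leap outlining an
acceptable three-note sonority); the interval sets are simplified placeholders of
mine, not a faithful modal-counterpoint ruleset.

STEP_SIZES = {1, 2}                  # semitone or whole-tone step
LEAP_SIZES = {3, 4, 5, 7, 8, 12}     # assumption: 3rds, 4ths, 5th, minor 6th, octave
TRIAD_PAIRS = {(3, 4), (4, 3), (3, 5), (5, 3), (4, 5), (5, 4)}  # triadic outlines

def leap_violations(melody):
    """Return (index, message) pairs where the leap rules fail.
    `melody` is a list of MIDI note numbers."""
    problems = []
    for i in range(1, len(melody) - 1):
        prev = melody[i] - melody[i - 1]
        nxt = melody[i + 1] - melody[i]
        if abs(prev) in STEP_SIZES:
            continue  # the approach was a step: nothing to check
        if abs(prev) not in LEAP_SIZES:
            problems.append((i, f"leap of {abs(prev)} semitones not permitted"))
        if abs(nxt) in STEP_SIZES and prev * nxt < 0:
            continue  # recovered by a step in the opposite direction
        if abs(nxt) in LEAP_SIZES and (abs(prev), abs(nxt)) in TRIAD_PAIRS:
            continue  # second leap; the pair outlines a (placeholder) triad
        problems.append((i, "leap not followed by contrary step or triadic leap"))
    return problems

print(leap_violations([60, 65, 64, 62, 60]))  # [] -- leap of a 4th, then steps down
print(leap_violations([60, 67, 69]))          # flagged: leap then same-direction step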

... in its earlier days, set theory has had an air of the secret society about it,
with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
Page 286
junk_scribd.txt
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 287
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 288
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 289
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 290
junk_scribd.txt

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 291
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
Page 292
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
Page 293
junk_scribd.txt
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 294
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
Page 295
junk_scribd.txt
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
Page 296
junk_scribd.txt
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
Page 297
junk_scribd.txt
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance


Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as, frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.
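
As a concrete illustration of quantifying pitch by frequency (my addition, not part
of the quoted article): in the conventional equal-tempered mapping, A4 is 440 Hz and
each semitone is a factor of 2^(1/12), so a frequency can be snapped to the nearest
MIDI note number via 69 + 12*log2(f/440).

    import math

    NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def freq_to_note(freq_hz: float) -> str:
        # MIDI note 69 corresponds to A4 = 440 Hz; 12 semitones per octave.
        midi = round(69 + 12 * math.log2(freq_hz / 440.0))
        octave = midi // 12 - 1        # MIDI 60 -> C4 ("middle C")
        return f"{NAMES[midi % 12]}{octave}"

    print(freq_to_note(440.0))    # A4
    print(freq_to_note(261.63))   # C4, middle C
    print(freq_to_note(466.16))   # A#4

Note that this assigns a nominal pitch to a frequency; as the passage stresses, the
perceived pitch is a separate, subjective matter.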


In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have however called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from
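
The leap rule quoted above is concrete enough to execute. A toy checker (my own
simplification: pitches as MIDI note numbers, a "step" taken as at most two
semitones, and the "permissible three-note sonorities" clause ignored):

    def leap_rule_ok(melody: list[int]) -> bool:
        # A leap (> 2 semitones) must be followed by a step in the opposite
        # direction or by another leap.
        for a, b, c in zip(melody, melody[1:], melody[2:]):
            first, second = b - a, c - b
            if abs(first) > 2:
                steps_back = 0 < abs(second) <= 2 and first * second < 0
                leaps_on = abs(second) > 2
                if not (steps_back or leaps_on):
                    return False
        return True

    print(leap_rule_ok([60, 65, 64, 62]))  # leap up a fourth, then steps down: True
    print(leap_rule_ok([60, 65, 67]))      # leap up, then a step up: False

Making such rules executable is one way to see that they are, as the author says, not
arbitrary but constraints that let a listener parse the musical fabric.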

...earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the frustration
has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language.
Where did this theory come from and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music.
Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical
configurations. Tonal music shares a common practice of harmony and voice leading;
post-tonal music is more highly self-referential: each work defines anew its basic
shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new
music theory was needed, free of traditional...
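
For readers stopped by tokens like "interval vector" and "6-Z44": the interval-class
vector is simply a six-place tally of the interval classes (1 through 6) formed by
all pairs of a pitch-class set. A sketch of the standard definition (the prime form
{0,1,2,5,6,9} used below for Forte's 6-Z44 is quoted from memory of the standard set
tables):

    from itertools import combinations

    def interval_class_vector(pcs: set[int]) -> list[int]:
        # Count interval classes 1..6 over all unordered pairs, mod 12.
        vec = [0] * 6
        for a, b in combinations(sorted(pcs), 2):
            d = (b - a) % 12
            ic = min(d, 12 - d)        # interval class is the smaller of d and 12 - d
            vec[ic - 1] += 1
        return vec

    print(interval_class_vector({0, 4, 7}))           # major triad: [0, 0, 1, 1, 1, 0]
    print(interval_class_vector({0, 1, 2, 5, 6, 9}))  # 6-Z44: [3, 1, 3, 4, 3, 1]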

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.


Page 306
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
Page 307
junk_scribd.txt
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 308
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 309
junk_scribd.txt

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
Page 310
junk_scribd.txt

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 311
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 312
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 313
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
Page 314
junk_scribd.txt
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Page 315
junk_scribd.txt
Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 316
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
Page 317
junk_scribd.txt
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 318
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 319
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.
