
Bob's Atonal Theory Primer
Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by the integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by the integers mod 12 (see below).
(4) Pcs are related to pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and in pitch).

Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
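These definitions map directly onto a programming language's set type. A minimal Python sketch (an illustration, not part of the primer; pitch numbers are MIDI-style integers by assumption):

# Sketch of pitch-class (pc) arithmetic and the finite-set operations above.
# Pitches are modeled as integers (semitones); pcs are pitches mod 12.

U = set(range(12))  # the aggregate: all twelve pitch-classes

def pc(pitch: int) -> int:
    """Map a pitch (integer semitone) to its pitch-class, per definition (4)."""
    return pitch % 12

# A pset (unordered set of pitches) and the pcset it represents:
pset = {60, 64, 67, 76}          # e.g. C4, E4, G4, E5
pcset = {pc(p) for p in pset}    # {0, 4, 7}: octave-related pitches merge

A = pcset
B = {0, 2, 4, 7, 9}

print(4 in A)        # membership: 4 in A -> True
print(A <= B)        # inclusion: A contained in B -> True
print(A | B)         # union of A and B
print(A & B)         # intersection of A and B
print(A & (U - A))   # A intersected with its complement is empty: disjoint

Note how the octave-related pitches 64 and 76 collapse to the single pc 4, which is exactly the pitch-to-pc mapping of definition (4).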

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
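The leap rule stated above is concrete enough to check mechanically. Here is a small Python sketch (my own simplification: it measures intervals in semitones, whereas the rule is properly stated in diatonic terms, and it ignores the permissible three-note sonorities):

# Illustrative check of one cantus-firmus rule: a leap (> 2 semitones)
# should be followed by a step in the opposite direction or by another leap.

def leap_rule_violations(melody):
    """Return indices where a leap is followed by a step in the SAME direction."""
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        is_leap = abs(first) > 2
        is_step = 0 < abs(second) <= 2
        same_direction = first * second > 0
        if is_leap and is_step and same_direction:
            violations.append(i)
    return violations

print(leap_rule_violations([60, 65, 64, 62]))  # leap up, then steps down: []
print(leap_rule_violations([60, 65, 67]))      # leap up, step up: flagged -> [0]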

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
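For the uninitiated, the "interval vector" mentioned above is straightforward to compute. A short Python sketch (illustrative; the pcset {0, 1, 2, 5, 6, 9} is one common realization of Forte's 6-Z44, cited here from memory):

from itertools import combinations

def interval_class_vector(pcset):
    """Count interval classes 1..6 among all unordered pairs of pcs."""
    icv = [0] * 6
    for a, b in combinations(sorted(pcset), 2):
        interval = (b - a) % 12
        ic = min(interval, 12 - interval)  # fold intervals into classes 1..6
        icv[ic - 1] += 1
    return icv

print(interval_class_vector({0, 1, 2, 5, 6, 9}))  # expected [3, 1, 3, 4, 3, 1]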

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
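The quantification of pitch as frequency described above can be made concrete with the standard 12-tone equal-temperament formula. A minimal Python sketch, assuming the common A4 = 440 Hz reference (not specified in the article):

import math

A4 = 440.0  # reference frequency in Hz (assumed)

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz):
    """Return (note name, octave, cents offset) for the nearest 12-TET pitch."""
    # MIDI note number: 69 is A4; each semitone is a factor of 2**(1/12).
    midi = 69 + 12 * math.log2(freq_hz / A4)
    nearest = round(midi)
    cents = 100 * (midi - nearest)  # deviation from the tempered pitch
    name = NOTE_NAMES[nearest % 12]
    octave = nearest // 12 - 1      # MIDI convention: note 60 is C4
    return name, octave, cents

print(nearest_pitch(440.0))   # ('A', 4, 0.0)
print(nearest_pitch(261.63))  # ('C', 4, ~0 cents)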

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
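A quick numerical illustration of this distinction (my own sketch; the histogram-based MI estimator is the crudest possible choice, which is precisely the estimation difficulty raised below):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2  # deterministic but non-monotonic dependence

# Pearson correlation is ~0 despite perfect dependence.
print(np.corrcoef(x, y)[0, 1])

def mutual_information(x, y, bins=30):
    """Crude plug-in MI estimate (in nats) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nonzero = p_xy > 0
    return float(np.sum(p_xy[nonzero] * np.log(p_xy[nonzero] / (p_x @ p_y)[nonzero])))

print(mutual_information(x, y))  # clearly > 0: MI detects the dependence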

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series' ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
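The MIOP measure relies on symbolizing each series by its ordinal patterns before estimating mutual information. A sketch of that symbolization step (my own illustration, with pattern length 3 assumed; not the authors' code):

import numpy as np
from itertools import permutations

def ordinal_patterns(series, order=3):
    """Encode each window of `order` samples by the permutation that sorts it."""
    lookup = {perm: i for i, perm in enumerate(permutations(range(order)))}
    symbols = []
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        symbols.append(lookup[tuple(np.argsort(window))])
    return np.array(symbols)

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
sym = ordinal_patterns(x)
print(np.bincount(sym, minlength=6) / len(sym))  # ~uniform for white noise

# Mutual information between two symbol streams can then be estimated from
# their joint symbol histogram, exactly as for binned continuous data.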

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 5
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 6
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 7
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 8
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
Page 9
junk_scribd.txt
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 10
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 11
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

Page 12
junk_scribd.txt

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
Page 13
junk_scribd.txt
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 14
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 15
junk_scribd.txt

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
Page 16
junk_scribd.txt
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 17
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 18
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 19
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 20
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 21
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
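
Read as an algorithm, the leap rule just described is strikingly checkable. A sketch under loudly stated assumptions: the whitelist of leap sizes and the triad test below are simplifications for illustration, not a faithful encoding of any counterpoint treatise.

    # Pitches are semitone integers (60 = middle C). A "leap" is any move
    # larger than 2 semitones. The allowed leap sizes and the triad list are
    # simplifying assumptions for illustration only.
    ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}    # m3, M3, P4, P5, m6, octave

    def outlines_triad(a, b, c):
        # do three successive notes outline a major or minor triad (any inversion)?
        pcs = sorted({0, (b - a) % 12, (c - a) % 12})
        return pcs in ([0, 3, 7], [0, 4, 7], [0, 3, 8], [0, 4, 9], [0, 5, 8], [0, 5, 9])

    def check_melody(notes):
        # return (index, reason) pairs for violations of the simplified leap rules
        problems = []
        for i in range(len(notes) - 1):
            step = notes[i + 1] - notes[i]
            if abs(step) <= 2:
                continue                    # stepwise motion is always fine
            if abs(step) not in ALLOWED_LEAPS:
                problems.append((i, f"forbidden leap of {abs(step)} semitones"))
            if i + 2 < len(notes):
                nxt = notes[i + 2] - notes[i + 1]
                opposite_step = abs(nxt) <= 2 and nxt * step < 0
                second_leap = abs(nxt) > 2 and outlines_triad(notes[i], notes[i + 1], notes[i + 2])
                if not (opposite_step or second_leap):
                    problems.append((i + 1, "leap not resolved by an opposite step "
                                            "or a permissible second leap"))
        return problems

    print(check_melody([60, 64, 62, 69, 67, 60]))   # [] : obeys the toy rules
    print(check_melody([60, 66, 71]))               # tritone leap, bad continuation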

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
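
Since the passage invokes "interval vector" as forbidding jargon, it may help to see how small the idea actually is. A sketch (the pcset below is, if memory serves, the one Forte labels 6-Z44; treat that name as an assumption worth checking against a table):

    from itertools import combinations

    def interval_vector(pcset):
        # count interval classes 1..6 over all unordered pairs of pitch classes
        vec = [0] * 6
        for a, b in combinations(sorted(set(pcset)), 2):
            ic = min((b - a) % 12, (a - b) % 12)    # interval class, 1..6
            vec[ic - 1] += 1
        return vec

    print(interval_vector({0, 1, 2, 5, 6, 9}))      # -> [3, 1, 3, 4, 3, 1]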

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
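
The quantification described above, assigning a pitch by matching a sound against pure tones of known frequency, can be caricatured in a few lines. A sketch assuming NumPy, an autocorrelation period estimate, and the A4 = 440 Hz equal-tempered reference:

    import numpy as np

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def estimate_f0(signal, sr):
        # crude fundamental-frequency estimate from the autocorrelation peak
        sig = signal - signal.mean()
        ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
        trough = np.argmax(ac[1:] < 0) + 1          # skip the zero-lag lobe
        period = trough + np.argmax(ac[trough:])    # first full-period peak
        return sr / period

    def nearest_pitch(freq, a4=440.0):
        # map a frequency to the nearest equal-tempered note name
        n = round(12 * np.log2(freq / a4))          # semitones away from A4
        name = NOTE_NAMES[(n + 9) % 12]             # A sits at index 9 of octave 4
        octave = 4 + (n + 9) // 12
        return f"{name}{octave}"

    sr = 44100
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 261.63 * t)           # a pure tone near middle C
    f0 = estimate_f0(tone, sr)
    print(f"{f0:.1f} Hz -> {nearest_pitch(f0)}")    # expect ~261 Hz -> C4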

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 24
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 25
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 26
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
Page 27
junk_scribd.txt
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 28
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 29
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 30
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 31
junk_scribd.txt
httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In
thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

Page 32
junk_scribd.txt
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.
Page 33
junk_scribd.txt

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 34
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 35
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 36
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
Page 37
junk_scribd.txt
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 38
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 39
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 40
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
Page 41
junk_scribd.txt
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
Page 42
junk_scribd.txt
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
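
A toy computational analogy for the temporal-coding idea (not a physiological model): the periodicity that phase-locked spike intervals would carry can be read off a signal's autocorrelation. A sketch, with assumed parameters fmin/fmax bounding the plausible fundamental range:

import numpy as np

def autocorr_f0(signal, sr, fmin=50.0, fmax=1000.0):
    # Autocorrelation at non-negative lags; a periodic signal peaks at
    # lags equal to (multiples of) its period.
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)   # candidate period range
    best_lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / best_lag

sr = 8000
t = np.arange(4000) / sr                      # half a second of signal
tone = np.sin(2*np.pi*220*t) + 0.5*np.sin(2*np.pi*440*t)
print(autocorr_f0(tone, sr))                  # ~222 Hz (integer-lag resolution)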

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
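
A small sketch of that contrast, assuming NumPy and scikit-learn's nearest-neighbour MI estimator (mutual_info_regression): for a noisy parabola, Pearson's r is near zero while the estimated mutual information is clearly positive.

import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 2000)
y = x**2 + 0.05 * rng.normal(size=2000)   # dependent, but not monotonically

# Pearson's r is ~0 here: the dependence has no linear component.
print("Pearson r:", np.corrcoef(x, y)[0, 1])

# MI is clearly > 0: observing X reduces uncertainty about Y.
print("MI (nats):", mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0])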

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are non-linear ones.
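
A minimal sketch of that pipeline, using only the |CC| variant of the SSM (the paper's MI and MIOP measures could be slotted into the same place); the threshold value here is an arbitrary assumption, not taken from the paper:

import numpy as np

def functional_network(series, threshold):
    # series: shape (n_oscillators, n_samples). SSM = |Pearson cross-correlation|.
    ssm = np.abs(np.corrcoef(series))
    np.fill_diagonal(ssm, 0.0)              # no self-links
    return (ssm >= threshold).astype(int)   # adjacency matrix of the network

t = np.linspace(0.0, 10.0, 1000)
rng = np.random.default_rng(2)
series = np.vstack([
    np.sin(t),                                # oscillator 0
    np.sin(t) + 0.3 * rng.normal(size=1000),  # oscillator 1, coupled to 0
    rng.normal(size=1000),                    # oscillator 2, independent noise
])
print(functional_network(series, threshold=0.5))  # links 0 and 1 only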

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" with whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is much more delicate and uncertain work compared to the estimation of Covariance.
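
That asymmetry is easy to see in code: sample covariance is a single moment expression, while a plug-in MI estimate first needs a density estimate and inherits its arbitrariness. A sketch (the binned estimator below is a generic plug-in choice, not taken from the quoted answer):

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = np.sign(x) * np.abs(x) ** 0.5 + 0.1 * rng.normal(size=500)

# Covariance: computable directly from the sample, no density needed.
print("sample covariance:", np.cov(x, y)[0, 1])

def mi_binned(x, y, bins):
    # Plug-in MI estimate (in nats) from a 2-D histogram density estimate.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

# The estimate depends visibly on the binning choice:
for b in (5, 20, 80):
    print(f"MI estimate with {b} bins:", round(mi_binned(x, y, b), 3))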

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have however called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
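
The leap rule quoted above is mechanical enough to check in code. A deliberately simplified sketch: it enforces only "leap, then step in the opposite direction," and ignores the permitted second-leap sonorities the text mentions.

def leaps_resolved(melody):
    # melody: pitches as integers (semitones), as in the primer's pitch space.
    # A leap (interval > 2 semitones) must be followed by a step
    # (1-2 semitones) in the opposite direction.
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:  # a leap
            if not (1 <= abs(second) <= 2 and first * second < 0):
                return False
    return True

print(leaps_resolved([60, 65, 64, 62]))  # leap up a fourth, then steps down: True
print(leaps_resolved([60, 65, 67, 69]))  # leap up, then keeps rising: False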

In earlier days, set theory had an air of the secret society about it, with admission granted only to those who possessed the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships.
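
Of those forbidding terms, the interval vector at least is simple to compute: it tallies, for each interval class 1 through 6, how many pairs of pcs in a pcset lie that interval class apart (mod 12). A minimal sketch:

from itertools import combinations

def interval_class_vector(pcset):
    # For each unordered pair of pcs, the interval class is the smaller of
    # the mod-12 distance and its complement (so it runs from 1 to 6).
    vector = [0] * 6
    for a, b in combinations(sorted(pcset), 2):
        d = (b - a) % 12
        ic = min(d, 12 - d)
        vector[ic - 1] += 1
    return vector

# Example: the whole-tone tetrachord {0, 2, 4, 6}.
print(interval_class_vector({0, 2, 4, 6}))  # -> [0, 3, 0, 2, 0, 1]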

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
Page 52
junk_scribd.txt
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 53
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 54
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\
Page 55
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 56
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 57
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 58
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
Page 59
junk_scribd.txt
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 60
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


Page 61
junk_scribd.txt
nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
Page 62
junk_scribd.txt
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 63
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a
few permissible three-note sonorities. In multi-voice contexts, the leading of a
voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the
dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the
note is in the bass, soprano, or an inner voice.) But these voice-leading rules are
not arbitrary, for their own sake; they enable the listener to parse the ongoing
musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth. Many composers and analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts
such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to
do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of (uninterpreted)
pc segments, often called lynes, are realized in pitch, time, and other musical
dimensions, using some means of musical articulation to maintain an association
between the components of a given lyne.3 For instance, a lyne might be associated
with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

Since its earlier days, set theory has had an air of the secret society about it,
with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships
are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation a
new music theory was needed, free of traditional ...
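Since the passage leans on vocabulary like "interval vector" and "6-Z44," a small
sketch may demystify it: the interval-class vector simply counts the interval
classes (1 through 6) among all unordered pairs of a pcset. The identification of
{0, 1, 2, 5, 6, 9} as the prime form of Forte's 6-Z44 is from memory and should be
checked against a set-class table.

from itertools import combinations

def interval_class_vector(pcset):
    # Count interval classes 1..6 over all unordered pairs of the pcset.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # interval class is at most 6
        vec[ic - 1] += 1
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]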

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials
to frequencies in a stimulus.
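The quantification described above (pitch associated with frequency by comparison
with pure tones) can be sketched numerically. The 440 Hz A4 reference, MIDI-style
note numbering, and sharp-only note names below are conventional assumptions, not
part of the quoted text:

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_pitch(freq_hz, a4_hz=440.0):
    # Map a frequency to the nearest equal-tempered pitch and its offset in cents.
    # MIDI note 69 corresponds to A4; there are 12 semitones per octave.
    midi = 69 + 12 * math.log2(freq_hz / a4_hz)
    nearest = round(midi)
    cents = 100 * (midi - nearest)          # deviation from the nearest semitone
    name = NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1)
    return name, cents

print(hz_to_pitch(261.63))   # ('C4', ~0.0)  -- middle C
print(hz_to_pitch(466.16))   # ('A#4', ~0.0)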
