
junk_scribd.txt
In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As
I compose, for instance, I ask: Will the next note I write down form a consonance with
the other voices? If not, is the dissonance correctly prepared and resolved? What
scale degrees and harmonies are involved? (And the answers to such questions will of
course depend on whether the note is in the bass, soprano, or an inner voice.) But these
voice-leading rules are not arbitrary, for their own sake; they enable the listener to
parse the ongoing musical fabric into meaningful units. They help me to determine "by
ear" whether the next note is in the same voice, or jumps to another in an arpeggiation,
or is ornamental or not, and so forth. Many composers and analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts such
as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such
as Schenkerian prolongation to music that appears to have little to do with tonality or
even pitch concentricity.1 Joseph N. Straus and others have, however, called such work
into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical
articulation to maintain an association between the components of a given lyne.3 For
instance, a lyne might be associated with a register, an instrument, a dynamic level, a
mode of articulation, or any combination of these, thereby separating it out from

...earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared
to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created
understandable frustration among musicians, and the frustration has grown as discussions
of twentieth-century music in the professional theoretical literature have come to be
expressed almost entirely in this unfamiliar language. Where did this theory come from
and how has it managed to become so dominant? Set theory emerged in response to the
motivic and contextual nature of post-tonal music. Tonal music uses only a small number
of referential sonorities (triads and seventh chords); post-tonal music presents an
extraordinary variety of musical configurations. Tonal music shares a common practice of
harmony and voice leading; post-tonal music is more highly self-referential: each work
defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives
become independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional
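The "interval vector" jargon mentioned above is mechanical to compute. A minimal sketch in Python (the function name is my own, not from any established set-theory library):

```python
from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector of a pitch-class set.

    Tally, for every unordered pair of pitch classes, its interval
    class: the smaller of the interval mod 12 and its inversion (1-6).
    """
    vector = [0] * 6
    for a, b in combinations(pcs, 2):
        interval = abs(a - b) % 12
        ic = min(interval, 12 - interval)  # interval class, 1..6
        vector[ic - 1] += 1
    return vector

# Forte's set class 6-Z44 is {0, 1, 2, 5, 6, 9}:
print(interval_vector([0, 1, 2, 5, 6, 9]))  # -> [3, 1, 3, 4, 3, 1]
```

The "Z" in 6-Z44 marks the Z-relation: 6-Z19 {0, 1, 3, 4, 7, 8} shares this same interval vector even though the two sets are not related by transposition or inversion.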

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
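The quantification of pitch as frequency described above has a standard arithmetic form in twelve-tone equal temperament, where each semitone multiplies frequency by 2^(1/12) and A4 = 440 Hz is pinned to MIDI note 69. A minimal sketch (the helper names are my own):

```python
import math

def frequency_to_midi(freq_hz, a4_hz=440.0):
    """Nearest equal-tempered MIDI note number for a frequency in Hz."""
    return round(69 + 12 * math.log2(freq_hz / a4_hz))

def midi_to_frequency(note, a4_hz=440.0):
    """Frequency in Hz of an equal-tempered MIDI note number."""
    return a4_hz * 2 ** ((note - 69) / 12)

print(frequency_to_midi(261.63))  # middle C (C4) -> 60
print(midi_to_frequency(69))      # A4 -> 440.0
```

Rounding to the nearest note is exactly the "comparing sounds with pure tones" step in miniature: a measured frequency is assigned to the closest point on the discrete musical scale.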
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL (Kullback-Leibler) divergence between the joint
density and the product of the individual densities. So MI can measure non-monotonic
relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearson's coefficient) CC, the mutual information
MI, and the mutual information of the time-series ordinal patterns (MIOP). The former
is a linear measure and the latter two are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain task compared to the estimation of Covariance.
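The contrast drawn in these excerpts is easy to reproduce numerically. A minimal sketch using a plug-in histogram estimator of MI (the data, bin count, and function name are my own choices, not taken from the cited posts):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2  # deterministic but non-monotonic dependence on x

# Pearson correlation is near zero: the relationship is not linear.
r = np.corrcoef(x, y)[0, 1]

def mutual_information(x, y, bins=20):
    """Plug-in MI estimate (in bits) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

mi = mutual_information(x, y)
print(f"Pearson r = {r:.3f}, estimated MI = {mi:.2f} bits")
```

The histogram estimator is biased upward for small samples and sensitive to the bin count, which is the "delicate and uncertain" estimation problem the last excerpt refers to; here it serves only to show MI detecting a dependence that correlation misses.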



