The union of two sets A and B (written A ∪ B) contains the elements of both of them. (10) The intersection of two sets A and B is the set of their common elements (written A ∩ B).
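The set operations described above can be sketched with Python's built-in set type (the sets A and B here are illustrative, not from the original text):

```python
# Union and intersection of two sets, as in the definitions above.
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

union = A | B          # elements belonging to either set
intersection = A & B   # elements common to both sets

print(union)         # {1, 2, 3, 4, 5, 6}
print(intersection)  # {3, 4}
```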
Page 1
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
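The distinction quoted above can be made concrete: Pearson's correlation is computed on the raw values, while Spearman's is Pearson's applied to the ranks, so a monotonic but non-linear relationship gets a Spearman coefficient of exactly 1 while Pearson's stays below 1. A minimal pure-Python sketch (the data is made up; the rank helper ignores ties for brevity):

```python
def pearson(x, y):
    """Pearson correlation: strength of the *linear* relationship."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson on the ranks, so it captures
    any monotonic relationship (ties ignored for brevity)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))

x = [1, 2, 3, 4, 5, 6]
y = [v ** 3 for v in x]   # monotonic but non-linear

print(round(pearson(x, y), 3))   # high, but below 1
print(round(spearman(x, y), 3))  # 1.0: perfectly monotonic
```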
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
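The quantification of pitch as frequency described above is conventionally done on the equal-tempered scale with A4 = 440 Hz. That convention (an assumption here, not stated in the excerpt) can be sketched as:

```python
import math

A4_HZ = 440.0   # standard tuning reference (A4)
A4_MIDI = 69    # MIDI note number of A4

def midi_to_freq(n):
    """Equal-tempered frequency in Hz of MIDI note n."""
    return A4_HZ * 2 ** ((n - A4_MIDI) / 12)

def freq_to_nearest_midi(f):
    """Quantify a frequency as the nearest equal-tempered pitch."""
    return round(A4_MIDI + 12 * math.log2(f / A4_HZ))

print(midi_to_freq(69))            # 440.0
print(round(midi_to_freq(60), 2))  # 261.63 (middle C)
print(freq_to_nearest_midi(262))   # 60
```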
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
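The "ordinal patterns" behind MIOP replace each window of d consecutive samples with the permutation that would sort it, turning the series into a symbol sequence whose mutual information can then be estimated. A minimal sketch of that symbolization step (window length d = 3; the series is illustrative):

```python
from collections import Counter

def ordinal_patterns(series, d=3):
    """Map each length-d window to its ordinal pattern: the tuple of
    indices that sorts the window by value."""
    pats = []
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        pats.append(tuple(sorted(range(d), key=lambda k: window[k])))
    return pats

x = [4, 7, 9, 10, 6, 11, 3]
print(Counter(ordinal_patterns(x, 3)))
# (0, 1, 2) and (2, 0, 1) each occur twice; (1, 0, 2) once
```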
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain task compared to the estimation of Covariance.
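The contrast drawn here can be demonstrated numerically: for y = x² on a sample symmetric around zero, covariance vanishes even though y is fully determined by x, while a crude histogram-based mutual-information estimate is clearly positive. A sketch, not a production-grade MI estimator:

```python
import math
from collections import Counter

def covariance(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

def mutual_information(x, y, bins=4):
    """Plug-in MI estimate (in bits) from a 2-D histogram.
    Crude: real estimators handle binning and bias far more carefully."""
    def bin_of(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    bx = [bin_of(v, min(x), max(x)) for v in x]
    by = [bin_of(v, min(y), max(y)) for v in y]
    n = len(x)
    pxy = Counter(zip(bx, by))
    px, py = Counter(bx), Counter(by)
    return sum(c / n * math.log2((c / n) / ((px[i] / n) * (py[j] / n)))
               for (i, j), c in pxy.items())

x = [v / 10 for v in range(-10, 11)]  # symmetric around 0
y = [v * v for v in x]                # fully determined by x, yet uncorrelated

print(round(covariance(x, y), 6))    # 0.0: covariance misses the dependence
print(mutual_information(x, y) > 0)  # True: MI detects it
```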
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 5
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 7
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 8
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
Page 9
junk_scribd.txt
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 10
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 11
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 12
junk_scribd.txt
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
Page 13
junk_scribd.txt
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 14
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 15
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
Page 16
junk_scribd.txt
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 17
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice
is determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric
into meaningful units. They help me to determine "by ear" whether the next note is
in the same voice, or jumps to another in an arpeggiation, or is ornamental or not,
and so forth. Many composers and analysts have sought some extension or
generalization of tonal voice-leading for non-tonal music. Analysts such as Felix
Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such
as Schenkerian prolongation to music that appears to have little to do with tonality
or even pitch concentricity.1 Joseph N. Straus and others have however called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from
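The leap rule quoted in this passage is algorithmic enough to sketch as a checker. This is a hedged simplification: the two-semitone step threshold is an assumption, and the exception allowing a second leap that outlines a permissible three-note sonority is omitted.

```python
def leap_rule_ok(melody):
    """Check a simplified cantus-firmus rule on a melody given as MIDI notes:
    every leap (interval larger than two semitones) must be followed by a
    step (two semitones or fewer) in the opposite direction.

    The real rule also permits a second leap outlining certain three-note
    sonorities; that exception is deliberately left out of this sketch."""
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        leap = b - a
        if abs(leap) > 2:                                 # a leap occurred
            follow = c - b
            if not (abs(follow) <= 2 and follow * leap < 0):
                return False                              # no step back
    return True

print(leap_rule_ok([60, 65, 64, 62]))   # leap up, then steps down -> True
print(leap_rule_ok([60, 65, 67]))       # leap followed by more ascent -> False
```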
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
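The temporal-coding idea (the periodicity of the signal itself carries the pitch cue) can be illustrated with a crude autocorrelation pitch estimator on a synthetic tone. This is a toy analogy, not a model of auditory-nerve phase locking; the lag bounds (50 Hz to 2 kHz) are arbitrary illustrative choices:

```python
import numpy as np

def autocorr_pitch(signal: np.ndarray, sample_rate: float, fmin: float = 50.0) -> float:
    """Estimate the fundamental frequency from the dominant
    autocorrelation peak at a nonzero lag."""
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    min_lag = int(sample_rate / 2000.0)   # ignore lags shorter than a 2 kHz period
    max_lag = int(sample_rate / fmin)     # ignore lags longer than an fmin period
    lag = min_lag + np.argmax(ac[min_lag:max_lag])
    return sample_rate / lag

sr = 44100.0
t = np.arange(int(0.05 * sr)) / sr        # 50 ms of signal
tone = np.sin(2 * np.pi * 220.0 * t)      # 220 Hz (A3)
print(round(autocorr_pitch(tone, sr), 1))
```

The estimate lands near 220 Hz because the autocorrelation of a periodic signal peaks at multiples of its period; this is exactly the kind of timing information a purely place-based account leaves out.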
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
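The linear-versus-monotonic distinction is easy to see numerically: for a monotonic but non-linear relationship, Spearman's coefficient (Pearson applied to the ranks) is exactly 1 while Pearson's is not. A small numpy-only sketch:

```python
import numpy as np

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Spearman's correlation is Pearson's correlation of the ranks.
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return pearson(rx, ry)

x = np.linspace(0.1, 5.0, 200)
y = np.exp(x)                    # monotonic but strongly non-linear

print(round(pearson(x, y), 3))   # less than 1: the relation is not linear
print(round(spearman(x, y), 3))  # exactly 1.0: the relation is monotonic
```

The double `argsort` trick computes ranks for tie-free data; real implementations (e.g. `scipy.stats.spearmanr`) also handle ties.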
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearson's coefficient) CC, the mutual information
MI, and the mutual information of the time series ordinal patterns MIOP [25]. The
former is a linear measure and the two latter are non-linear ones.
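The ordinal-pattern idea behind MIOP is to replace each short window of a time series with the permutation that sorts it, then compute mutual information between the resulting symbol streams. The sketch below uses embedding dimension 3 and a plug-in histogram MI estimator; these are illustrative choices, not the paper's exact settings:

```python
import numpy as np
from itertools import permutations

def ordinal_patterns(x: np.ndarray, dim: int = 3) -> np.ndarray:
    """Encode each length-`dim` window by the permutation that sorts it."""
    lookup = {p: i for i, p in enumerate(permutations(range(dim)))}
    return np.array([lookup[tuple(np.argsort(x[i:i + dim]))]
                     for i in range(len(x) - dim + 1)])

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int) -> float:
    """Plug-in MI estimate (in bits) from a joint histogram."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.normal(size=2000)
y = x ** 2                       # deterministic but non-linear coupling

cc = abs(np.corrcoef(x, y)[0, 1])   # the linear SSM: near zero here
miop = mutual_information(ordinal_patterns(x), ordinal_patterns(y), bins=6)
print(round(cc, 3), round(miop, 3))
```

For this symmetric non-linear coupling the linear measure |CC| is close to zero while the ordinal-pattern MI is clearly positive, which is why the paper pairs a linear SSM with non-linear ones.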
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" with whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain task compared to the estimation of Covariance.
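Both points in that answer show up in a few lines of code: with Y = X² and X symmetric, the sample covariance is essentially zero even though Y is fully determined by X, and estimating MI forces us to estimate a (joint) distribution first. A sketch, using a simple histogram density estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 50_000)
y = x ** 2                     # fully determined by x, yet uncorrelated

# Covariance comes straight from sample moments -- no densities needed.
cov = float(np.cov(x, y)[0, 1])

# MI needs the joint distribution, here estimated with a histogram.
joint, _, _ = np.histogram2d(x, y, bins=30)
p = joint / joint.sum()
px = p.sum(axis=1, keepdims=True)
py = p.sum(axis=0, keepdims=True)
nz = p > 0
mi = float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

print(round(cov, 4))   # approximately 0
print(mi > 0.5)        # True: strong dependence despite zero covariance
```

The bin count (30) is an arbitrary choice, and this is exactly the "delicate and uncertain" part: the MI estimate depends on it, while the covariance estimate has no such tuning knob.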
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships.
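Both characterizations in that quote are the same quantity, which is easy to verify on a small discrete joint distribution (the numbers below are arbitrary illustrative probabilities):

```python
import numpy as np

# A small discrete joint distribution p(x, y).
p = np.array([[0.30, 0.10],
              [0.05, 0.55]])

px = p.sum(axis=1, keepdims=True)   # marginal of X
py = p.sum(axis=0, keepdims=True)   # marginal of Y

def kl(a: np.ndarray, b: np.ndarray) -> float:
    """Kullback-Leibler divergence D(a || b) in bits."""
    nz = a > 0
    return float((a[nz] * np.log2(a[nz] / b[nz])).sum())

# MI as the KL divergence between the joint and the product of marginals.
mi = kl(p, px @ py)
print(round(mi, 4))

# Equivalently: the reduction of uncertainty in Y after observing X,
# i.e. H(Y) - H(Y|X).
h_y = -float((py[py > 0] * np.log2(py[py > 0])).sum())
h_y_given_x = -float((p[p > 0] * np.log2((p / px)[p > 0])).sum())
print(round(h_y - h_y_given_x, 4) == round(mi, 4))   # True
```

The two computations agree because expanding the KL formula term by term yields H(Y) - H(Y|X) (and, symmetrically, H(X) - H(X|Y)).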
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 39
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 40
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
Page 42
junk_scribd.txt
are higher and lower.
Sound
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 44
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 45
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 46
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 47
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
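The "pitch quantified as a frequency in hertz" idea above can be illustrated with a
short sketch (assuming numpy; the sample rate and the 440 Hz tone are arbitrary
choices for the example): a pure sinusoidal tone is generated and its frequency is
recovered from the location of the spectral peak.

```python
import numpy as np

sr = 8000                                # sample rate in Hz (assumed)
t = np.arange(sr) / sr                   # one second of sample times
wave = np.sin(2 * np.pi * 440.0 * t)     # pure tone: periodic, sinusoidal

spectrum = np.abs(np.fft.rfft(wave))     # magnitude spectrum
freqs = np.fft.rfftfreq(wave.size, d=1/sr)
estimated = freqs[np.argmax(spectrum)]   # frequency of the strongest component
print(estimated)  # 440.0
```

With a one-second window the frequency bins fall on integer hertz, so the peak
lands exactly on 440 Hz; this measures frequency, the objective attribute, while
pitch remains the listener's mapping of that frequency onto a musical scale.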
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 53
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 54
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
Page 56
junk_scribd.txt
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 57
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 58
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
Page 59
junk_scribd.txt
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
Page 60
junk_scribd.txt
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
Page 62
junk_scribd.txt
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 63
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
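The quantification of pitch as frequency can be made concrete with the standard 12-tone equal-temperament convention (A4 = 440 Hz), under which each semitone step multiplies frequency by 2^(1/12). A small sketch using MIDI note numbers:

```python
def midi_to_hz(midi_note, a4_hz=440.0):
    # 12-tone equal temperament: MIDI note 69 is A4; each semitone
    # above or below scales the frequency by a factor of 2**(1/12)
    return a4_hz * 2.0 ** ((midi_note - 69) / 12.0)

print(midi_to_hz(69))            # 440.0 (A4)
print(round(midi_to_hz(60), 2))  # 261.63 (middle C)
print(midi_to_hz(81))            # 880.0 (A5, one octave above A4)
```

This is only the conventional mapping from named pitches to frequencies; as the passage stresses, perceived pitch remains a subjective quantity that this formula merely approximates for periodic tones.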
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.