
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM


In essence, the algorithm looks at a string of intervals derived from the successive values in some musical dimension in a piece of music. The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one string can be selected for analysis. Then the algorithm combines the values of each dimension's successive intervals according to a user-specified average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four successive dimension values labeled A through D, forming three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string of values is segmented in half; the value C starts a new segment or phrase. In its simplest form, using only one musical dimension, the algorithm works by going through the dimension's list of undirected intervals in threes, looking for maximum values and segmenting accordingly. This results in a series of successive segments (or phrases). We can then average the values in each of the output segments to get a series of new higher-order values. We input these into the algorithm to produce a second-order segmentation, and so forth, until the music is parsed into a single segment.
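The one-dimensional pass is easy to sketch in Python. The following is a minimal reading of the description above, not Tenney and Polansky's own code: pitches are taken as MIDI numbers, the higher-order values are plain means (the discussion below uses the center pitch of each segment's bandwidth instead), and all names and data are invented for illustration.

def undirected_intervals(values):
    # Absolute differences between successive values in one dimension.
    return [abs(b - a) for a, b in zip(values, values[1:])]

def segment_once(values):
    # Scan interval triples (X, Y, Z); if the middle interval Y exceeds its
    # neighbors, the value that starts interval Z begins a new segment
    # (the "C" of Example 20).
    ivs = undirected_intervals(values)
    boundaries = [k + 1 for k in range(1, len(ivs) - 1)
                  if ivs[k] > ivs[k - 1] and ivs[k] > ivs[k + 1]]
    cuts = [0] + boundaries + [len(values)]
    return [values[a:b] for a, b in zip(cuts, cuts[1:])]

def parse(values):
    # Re-run the pass on segment averages until one segment remains.
    levels = [segment_once(values)]
    while len(levels[-1]) > 1:
        averages = [sum(seg) / len(seg) for seg in levels[-1]]
        levels.append(segment_once(averages))
    return levels

pitches = [64, 66, 67, 65, 60, 72, 71, 69]   # a made-up fragment
for depth, segs in enumerate(parse(pitches), start=1):
    print(depth, segs)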
Example 18. Primes enclosed in rectangles
<0 2 1>   <1 0 3 2>   <1 3 1 4 0 2>
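The bracketed strings above are contour segments (csegs): each note is replaced by its rank within the segment, 0 for the lowest. A minimal sketch of how such labels are read off a pitch string (assuming distinct pitches; the prime-form reduction that Example 18 boxes is a further normalization not shown here):

def cseg(pitches):
    # Rank each pitch within the segment: 0 = lowest, n-1 = highest.
    ranks = sorted(pitches)
    return tuple(ranks.index(p) for p in pitches)

print(cseg([60, 67, 64]))        # -> (0, 2, 1)
print(cseg([62, 60, 67, 64]))    # -> (1, 0, 3, 2)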
To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a shows the results using one dimension: pitch alone. The first pass segments the pitches into segments of three to six pitches; that is, the segmentation is determined by the sequence of the sizes of successive unordered pitch intervals. The segmental boundaries are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure, since they fall in a lower register than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation.

Let us examine Example 21b, where the algorithm takes only the time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4.

Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
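A sketch of the combined pitch-and-duration pass of Example 21c, under the description above: each dimension's interval stream is formed separately, then merged with user-specified weights before the local-maximum test, with weights (1, 1) playing the role of "simply added together." The data values, weights, and names are invented for illustration.

def undirected(xs):
    return [abs(b - a) for a, b in zip(xs, xs[1:])]

def segment_weighted(dimensions, weights):
    # Weighted sum of the per-dimension interval streams, position by position.
    streams = [undirected(d) for d in dimensions]
    ivs = [sum(w * v for w, v in zip(weights, col)) for col in zip(*streams)]
    boundaries = [k + 1 for k in range(1, len(ivs) - 1)
                  if ivs[k] > ivs[k - 1] and ivs[k] > ivs[k + 1]]
    n = len(dimensions[0])
    cuts = [0] + boundaries + [n]
    return [list(range(a, b)) for a, b in zip(cuts, cuts[1:])]  # note indices

pitches = [64, 66, 67, 65, 60, 72, 71, 69]   # hypothetical
onsets  = [0, 4, 8, 12, 20, 24, 28, 36]      # in thirty-seconds, hypothetical
print(segment_weighted([pitches, onsets], weights=(1, 1)))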


https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
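As a numeric illustration of the quoted contrast (my own demo, not part of the quoted answer): for Y = X^2 on a symmetric interval, the Pearson correlation is near zero, while a histogram estimate of the mutual information I(X;Y) = sum over x,y of p(x,y) log[ p(x,y) / (p(x) p(y)) ] is clearly positive. The sample size and bin count are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2                      # deterministic but nonlinear: corr ~ 0, MI > 0

corr = np.corrcoef(x, y)[0, 1]

# Plug-in MI estimate from a 2-D histogram of the joint distribution.
joint, _, _ = np.histogram2d(x, y, bins=20)
p_xy = joint / joint.sum()
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
nz = p_xy > 0
mi_nats = float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

print(f"Pearson correlation:  {corr:+.3f}")   # close to 0
print(f"Histogram MI (nats):  {mi_nats:.3f}")  # clearly > 0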
Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
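As a toy illustration of how such a voice-leading rule can be made mechanical, the sketch below checks the leap rule quoted above for three successive notes. The specific sets of permitted leaps and permissible three-note outlines are simplified assumptions, not a statement of counterpoint practice.

STEP = {1, 2}                            # semitone sizes treated as steps
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}    # m3, M3, P4, P5, m6, octave (assumed)
PERMISSIBLE_OUTLINES = {(3, 4), (4, 3), (3, 5), (4, 5)}  # assumed triadic outlines

def leap_rule_ok(a: int, b: int, c: int) -> bool:
    # Check three successive pitches (MIDI numbers): a leap must be followed
    # by a step in the opposite direction, or by a second leap whose two
    # intervals form a permissible three-note outline.
    i1, i2 = b - a, c - b
    if abs(i1) in STEP:
        return True                      # the rule only constrains leaps
    if abs(i1) not in PERMITTED_LEAPS:
        return False                     # leap too large to begin with
    if abs(i2) in STEP and (i1 > 0) != (i2 > 0):
        return True                      # step back in the opposite direction
    return (abs(i1), abs(i2)) in PERMISSIBLE_OUTLINES

print(leap_rule_ok(60, 67, 65))   # P5 leap up, step down: True
print(leap_rule_ok(60, 67, 69))   # P5 leap up, step up: False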

arlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch is closely related to frequency, but the two are not equivalent. Frequency is an objective, scientific attribute that can be measured. Pitch is each person's subjective perception of a sound wave, which cannot be directly measured. However, this does not necessarily mean that most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
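The three SSMs can be sketched as follows. This is a rough reconstruction, not the paper's exact procedure: the ordinal patterns use order-3 windows (permutation symbols), the MI estimator is a crude histogram plug-in, and the test signals are invented.

from itertools import permutations
import numpy as np

def ordinal_symbols(x, d=3):
    # Map each length-d window to the permutation that ranks its values.
    lookup = {p: i for i, p in enumerate(permutations(range(d)))}
    return np.array([lookup[tuple(np.argsort(x[i:i + d]))]
                     for i in range(len(x) - d + 1)])

def hist_mi(a, b, bins):
    # Histogram plug-in estimate of mutual information, in nats.
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 40, 2000)) + 0.1 * rng.standard_normal(2000)
y = x ** 3 + 0.1 * rng.standard_normal(2000)          # nonlinear coupling

cc = abs(np.corrcoef(x, y)[0, 1])                     # |cross correlation| CC
mi = hist_mi(x, y, bins=16)                           # mutual information MI
miop = hist_mi(ordinal_symbols(x), ordinal_symbols(y), bins=6)  # MIOP
print(f"CC={cc:.3f}  MI={mi:.3f}  MIOP={miop:.3f}")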

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-8-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-9-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar

-10-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-11-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
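The leap rule stated at the opening of the passage above is mechanical enough to check by
machine. Below is a minimal sketch of such a check over a melody given as MIDI note numbers;
the permitted leap sizes and the set of three-note outlines are simplified assumptions for
illustration, not a full statement of the modal-counterpoint rules.

# Leap rule: a leap must be followed by a step in the opposite direction,
# or by a second leap (same direction) outlining a permitted triad.
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}   # leap sizes in semitones (assumed)
TRIAD_OUTLINES = {(3, 4), (4, 3), (3, 5), (5, 3), (4, 5), (5, 4)}  # assumed

def leap_rule_ok(melody):
    for i in range(len(melody) - 2):
        a = melody[i + 1] - melody[i]        # interval just taken
        b = melody[i + 2] - melody[i + 1]    # interval that follows
        if abs(a) in PERMITTED_LEAPS:        # we just leapt ...
            step_back = 0 < abs(b) <= 2 and a * b < 0
            second_leap = a * b > 0 and (abs(a), abs(b)) in TRIAD_OUTLINES
            if not (step_back or second_leap):
                return False
        elif abs(a) > 2:                     # a leap of a forbidden size
            return False
    return True

print(leap_rule_ok([60, 64, 62]))    # leap of a third, step back down: True
print(leap_rule_ok([60, 63, 67]))    # two leaps outlining a minor triad: True
print(leap_rule_ok([60, 67, 74]))    # two consecutive fifths upward: False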


In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
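In the equal-tempered case, the "comparison with pure tones" that quantifies a pitch as a
frequency amounts to mapping the measured frequency onto the nearest semitone relative to some
reference. A minimal sketch, assuming the common A4 = 440 Hz reference and MIDI note numbering:

import math

NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def nearest_note(freq_hz, a4=440.0):
    # semitone distance from A4, rounded to the nearest equal-tempered step
    n = round(12 * math.log2(freq_hz / a4))
    midi = 69 + n                    # MIDI convention: A4 = 69
    # residual detuning from the nearest note, in cents
    cents = round(1200 * math.log2(freq_hz / (a4 * 2 ** (n / 12))))
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1), cents

print(nearest_note(261.63))   # ('C4', 0): middle C
print(nearest_note(446.0))    # ('A4', 23): an A4 running 23 cents sharp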

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-14-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-15-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-16-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-17-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-18-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-19-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that

-20-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-21-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-22-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-23-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-24-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-25-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They

-26-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
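
Terms like "interval vector" are less forbidding than they look; the vector just counts the
interval classes among all unordered pairs of a pitch-class set. A brief sketch using the
standard definition (the code is my own illustration):

from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs of the pc set.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)    # interval class is at most 6
        vec[ic - 1] += 1
    return vec

# Forte's 6-Z44, prime form [0, 1, 2, 5, 6, 9]:
print(interval_vector([0, 1, 2, 5, 6, 9]))      # -> [3, 1, 3, 4, 3, 1], i.e. vector 313431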

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
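
The hertz-based quantification of pitch described above has a simple closed form in twelve-tone
equal temperament. A small sketch, assuming the conventional A4 = 440 Hz reference (the
function name hz_to_pitch is my own):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_pitch(freq, a4=440.0):
    # Map a frequency to the nearest equal-tempered pitch plus its deviation in cents.
    midi = 69 + 12 * math.log2(freq / a4)       # 69 is the MIDI number of A4
    nearest = round(midi)
    cents = 100 * (midi - nearest)
    return NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1), cents

print(hz_to_pitch(261.63))    # ('C4', ~0.0): middle C
print(hz_to_pitch(450.0))     # ('A4', ~+38.9): a sharp A4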

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the
product of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships.
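
Written out, that characterization of mutual information is (standard discrete form):

    I(X;Y) = D_{KL}\big(p(x,y) \,\|\, p(x)\,p(y)\big)
           = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)},

which is zero exactly when the joint density factors, i.e. when X and Y are independent.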

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-28-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-29-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-30-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-31-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-32-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand

-33-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-34-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-35-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-36-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-37-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-38-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-39-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a

-40-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are
non-linear ones.
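
For the ordinal-pattern part, here is a sketch of the Bandt-Pompe symbolization that MIOP is
built on; the embedding dimension d=3, the toy signals, and the reuse of mutual_information()
from the sketch above are my choices, not details from the paper.

import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def ordinal_patterns(series, d=3):
    # Reduce each length-d window to the permutation that sorts it,
    # encoded as one integer (d! distinct patterns are possible).
    windows = sliding_window_view(series, d)
    ranks = np.argsort(windows, axis=1)
    return ranks @ (d ** np.arange(d))

rng = np.random.default_rng(1)
a = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.normal(size=3000)
b = a ** 3                 # strictly monotone transform: same ordinal patterns

pa, pb = ordinal_patterns(a), ordinal_patterns(b)
print(np.corrcoef(a, b)[0, 1])               # < 1: the relation is non-linear
print(mutual_information(pa, pb, bins=27))   # maximal: symbol streams coincide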

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task compared to the estimation of Covariance.
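
That last point, that covariance needs only sample moments while MI needs estimated densities,
is easy to demonstrate; a sketch (sample size and bin counts arbitrary, mutual_information()
reused from above):

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = 0.5 * x + 0.5 * rng.normal(size=500)

# Covariance comes straight from sample moments; no density estimate needed.
print(np.cov(x, y)[0, 1])                    # ~0.5, stable

# A histogram MI estimate depends on how the densities are estimated:
# 500 points spread over 80 x 80 bins inflate the estimate badly.
for bins in (5, 20, 80):
    print(bins, mutual_information(x, y, bins=bins))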

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
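
Read as an algorithm, one of those rules is directly checkable. The sketch below tests a
single, deliberately simplified constraint (a leap must be followed by a step in the opposite
direction or by another leap); the semitone thresholds are placeholders, since the actual rules
are stated diatonically and also restrict which leaps and three-note outlines are allowed.

def leap_rule_violations(melody):
    # melody: a list of MIDI note numbers. A "step" is 1-2 semitones,
    # a "leap" 3 or more; both are simplifications of the modal rules.
    violations = []
    intervals = [b - a for a, b in zip(melody, melody[1:])]
    for i in range(len(intervals) - 1):
        cur, nxt = intervals[i], intervals[i + 1]
        if abs(cur) >= 3:                              # this motion is a leap
            step_back = 1 <= abs(nxt) <= 2 and (cur > 0) != (nxt > 0)
            another_leap = abs(nxt) >= 3
            if not (step_back or another_leap):
                violations.append(i + 1)               # offending note index
    return violations

print(leap_rule_violations([62, 69, 67, 65, 64, 62]))  # []: leap, then step down
print(leap_rule_violations([62, 69, 71, 69, 67, 62]))  # [1]: leap, then step up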

...earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
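
For all the mystique around labels like "6-Z44," the interval vector itself is mechanical to
compute: tally the interval class (1 through 6) of every unordered pair of pitch classes in the
set. A sketch; the example set [0, 1, 2, 5, 6, 9] is the prime form usually given for Forte's
6-Z44.

from itertools import combinations

def interval_class_vector(pcs):
    # Count the interval classes among all unordered pairs of pitch classes.
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12
        vector[min(d, 12 - d) - 1] += 1    # interval class runs 1..6
    return vector

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]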

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a

-56-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-57-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-58-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
Bob's Atonal Theory Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
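A small sketch of these definitions in code. Modeling pcsets as Python frozensets, and the particular example pitches, are illustrative assumptions rather than anything the primer itself specifies:

```python
# Pitches as integers (semitones); pitch-classes as integers mod 12.
def pc(pitch: int) -> int:
    return pitch % 12  # octave-related pitches map to the same pc

AGGREGATE = frozenset(range(12))  # U, the set of all 12 pcs

pset = [64, 52, 67]                       # a pset: pitches E5, E4, G5
pcset = frozenset(pc(p) for p in pset)    # {4, 7}: many psets -> one pcset

A = frozenset({0, 4, 7})
B = frozenset({4, 7, 11})
print(A | B)                # union, A ∪ B
print(A & B)                # intersection, A ∩ B
print(AGGREGATE - A)        # complement A′ relative to U
print(A & (AGGREGATE - A))  # empty set: A and A′ are disjoint
```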

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of

the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
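As a concrete instance of the quantification described above, the conventional mapping from a measured frequency to the nearest equal-tempered pitch (with A4 = 440 Hz) can be sketched as follows; the MIDI numbering and note-naming scheme are common conventions, not something the quoted article specifies:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(freq_hz: float) -> str:
    # MIDI note number: 69 is A4 = 440 Hz; 12 semitones per octave.
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    octave = midi // 12 - 1  # MIDI 60 -> C4 ("middle C")
    return f"{NOTE_NAMES[midi % 12]}{octave}"

print(freq_to_pitch(440.0))   # A4
print(freq_to_pitch(261.63))  # C4
```

Rounding to the nearest integer MIDI number is what turns a raw frequency measurement into a pitch assignment.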

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-71-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-72-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-73-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary or for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
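
Rules of this kind are mechanical enough to check by machine. A toy sketch (Python, standard
library only; pitches as MIDI numbers, "step" taken as 1-2 semitones and "leap" as 3 or more,
which are illustrative thresholds rather than any treatise's exact values; the further
requirement that two successive leaps outline a permissible three-note sonority is omitted):

# Flag leaps that are not followed by a step back or by another leap.
def leap_rule_violations(melody):
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) >= 3:                                  # a leap...
            steps_back = 1 <= abs(second) <= 2 and first * second < 0
            another_leap = abs(second) >= 3
            if not (steps_back or another_leap):
                violations.append(i + 1)
    return violations

cantus = [60, 65, 64, 62, 60, 67, 65, 64, 62, 60]   # C F E D C G F E D C
print(leap_rule_violations(cantus))   # [] -> both leaps resolve by step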

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
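
The quantification of pitch as frequency described above is, in practice, a logarithmic
mapping. A small sketch (Python, standard library only; the A4 = 440 Hz reference and MIDI
numbering are standard conventions, not part of the quoted text):

# Map a frequency in hertz to the nearest equal-tempered pitch.
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_pitch(freq_hz, a4_hz=440.0):
    """Return (note name, octave, cents offset from the equal-tempered grid)."""
    semitones_from_a4 = 12 * math.log2(freq_hz / a4_hz)
    midi = round(69 + semitones_from_a4)           # MIDI note 69 = A4
    cents = 100 * (69 + semitones_from_a4 - midi)
    return NOTE_NAMES[midi % 12], midi // 12 - 1, cents

print(hz_to_pitch(261.63))   # ('C', 4, ...) -> middle C, ~0 cents off
print(hz_to_pitch(450.0))    # ('A', 4, ...) -> a sharp A4, about +39 cents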

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-75-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-76-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of

-77-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-78-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-79-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and

-80-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-81-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-82-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
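The leap-and-step rules described at the start of that passage are mechanical enough to check automatically. Here is a toy sketch (my own simplification, not a complete counterpoint checker): the interval whitelist is an assumption, and the "permissible three-note sonorities" condition is reduced to merely allowing a second leap.

from typing import List

STEP = {1, 2}                         # semitone sizes treated as steps
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # assumed whitelist: m3, M3, P4, P5, m6, P8

def check_melody(pitches: List[int]) -> List[str]:
    # Return rule violations for a melody given as MIDI note numbers.
    problems = []
    for i in range(len(pitches) - 1):
        iv = pitches[i + 1] - pitches[i]
        if abs(iv) > 2 and abs(iv) not in ALLOWED_LEAPS:
            problems.append(f"notes {i+1}->{i+2}: forbidden leap of {abs(iv)} semitones")
        if i > 0:
            prev = pitches[i] - pitches[i - 1]
            if abs(prev) > 2 and abs(prev) in ALLOWED_LEAPS:
                # After a leap: a step back the other way, or another allowed leap.
                ok = (abs(iv) in STEP and iv * prev < 0) or abs(iv) in ALLOWED_LEAPS
                if not ok:
                    problems.append(f"note {i+1}: leap not answered by opposite step or leap")
    return problems

print(check_melody([60, 65, 64, 62, 60]))   # P4 leap up, then steps down: no problems
print(check_melody([60, 67, 69]))           # P5 leap up, then a step up: flagged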

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related
scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
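The quantification described here, comparing a measured frequency against equal-tempered pure-tone references, can be sketched in a few lines (my own example; A4 = 440 Hz and twelve-tone equal temperament are the assumed references).

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(f_hz: float) -> str:
    # Nearest equal-tempered pitch, plus the deviation in cents.
    midi = 69 + 12 * math.log2(f_hz / 440.0)   # distance from A4 in semitones
    nearest = round(midi)
    cents = 100 * (midi - nearest)
    name = NAMES[nearest % 12] + str(nearest // 12 - 1)
    return f"{name} ({cents:+.1f} cents)"

print(freq_to_pitch(440.0))   # A4 (+0.0 cents)
print(freq_to_pitch(452.0))   # A4 (+46.6 cents): near A4 but noticeably sharp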
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

Forte is well known for his book The Structure of Atonal Music (1973), which traces many of its
roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964).[6] In
these works, he "applied set-theoretic principles to the analysis of unordered collections of
pitch classes, called pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to
define the various relationships that existed among the relevant sets of a work, so that
contextual coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it (to varying
degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six other types of publications, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period in its
development, from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal,
including many biographical details, is addressed in David Carson Berry, "Journal of Music
Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has

-94-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-95-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-96-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
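
The leap rule quoted above is concrete enough to state as a predicate. A toy sketch in Python,
assuming a "leap" is any interval larger than two semitones and leaving the permissible
three-note sonorities as a caller-supplied whitelist, since the passage does not enumerate them:

def check_leaps(melody, permitted_triples=frozenset()):
    """melody: MIDI note numbers. Flags each leap that is followed neither
    by a step in the opposite direction nor by a second leap outlining a
    permitted three-note sonority."""
    problems = []
    for i in range(len(melody) - 2):
        a, b, c = melody[i:i + 3]
        first, second = b - a, c - b
        if abs(first) > 2:                       # a leap (assumed: > a whole step)
            contrary_step = abs(second) <= 2 and second * first < 0
            second_leap_ok = (abs(second) > 2 and
                              tuple(sorted((a, b, c))) in permitted_triples)
            if not (contrary_step or second_leap_ok):
                problems.append(i)
    return problems

print(check_leaps([60, 64, 62]))   # [] -> leap of a third resolved by step down
print(check_leaps([60, 67, 71]))   # [0] -> two leaps, sonority not whitelisted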

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
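
The comparison with pure tones described above is essentially what a naive fundamental-frequency
estimator does. A sketch that reads the dominant frequency off an FFT magnitude peak and maps it
to the nearest equal-tempered note, assuming the standard A4 = 440 Hz reference:

import numpy as np

def estimate_pitch(signal, sample_rate):
    # dominant frequency from the windowed magnitude spectrum
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    f0 = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC bin
    # map frequency to the nearest MIDI note (A4 = 440 Hz = MIDI 69)
    midi = int(round(69 + 12 * np.log2(f0 / 440.0)))
    names = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
    return f0, f"{names[midi % 12]}{midi // 12 - 1}"

sr = 44_100
t = np.arange(sr) / sr
print(estimate_pitch(np.sin(2 * np.pi * 440 * t), sr))   # (~440.0, 'A4')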

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-113-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
Forte is well known for his book The Structure of Atonal Music (1973), which traces many of its
roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964).[6] In
these works, he "applied set-theoretic principles to the analysis of unordered collections of
pitch classes, called pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to
define the various relationships that existed among the relevant sets of a work, so that
contextual coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it (to varying
degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six publications of other types, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period in its
development, from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal,
including many biographical details, is addressed in David Carson Berry, "Journal of Music
Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.

-114-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, April 15, 2017 2:35 PM
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
1. "In memoriam Allen Forte, music theorist." news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964):
136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary
of American Music, 2nd ed., ed. Charles Hiroshi Garrett (New York: Oxford University Press,
2013), 8:175-176.
Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
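These definitions map directly onto Python's built-in set type; the sketch below (mine, assuming the common convention of middle C = 60 for pitch integers) realizes the operations in (6)-(12).

AGGREGATE = frozenset(range(12))            # U, the set of all twelve pcs

def pcs(pset):
    """Map a pset (pitches as integers) to its pcset (integers mod 12)."""
    return frozenset(p % 12 for p in pset)

A = pcs({60, 64, 67, 72})                   # C4, E4, G4, C5 -> {0, 4, 7}
B = frozenset({0, 4, 7, 10})

print(A <= B)                               # inclusion, A ⊆ B: True
print(A | B)                                # union A ∪ B
print(A & B)                                # intersection A ∩ B
complement = AGGREGATE - A                  # A′: all pcs of U not in A
print(A & complement == frozenset())        # A ∩ A′ = ∅, so A and A′ are disjoint

Note that the octave duplication (60 and 72) collapses to a single pc, which is exactly the pitch-to-pc mapping of point (4).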

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
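The cantus firmus rules quoted at the start of this passage invite mechanical checking. The toy function below is my illustration, not the author's system: it calls any interval of three or more semitones a leap and tests only the step-back rule, ignoring the permissible three-note-sonority exception.

def unresolved_leaps(melody):
    """Indices where a leap is not followed by a step in the opposite direction."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) >= 3:                           # a leap...
            steps_back = 1 <= abs(second) <= 2 and first * second < 0
            if not steps_back:
                bad.append(i)
    return bad

# The leap 60 -> 65 resolves down by step; the leap 62 -> 67 keeps rising.
print(unresolved_leaps([60, 65, 64, 62, 67, 69]))     # -> [3]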

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
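The "interval vector" jargon mentioned above is only a tally of the six interval classes over all pairs of pcs in a set. A short sketch (mine; taking {0, 1, 2, 5, 6, 9} as a realization of the hexachord 6-Z44 is an assumption to be checked against a set-class table):

from itertools import combinations

def interval_class_vector(pcset):
    """Count interval classes 1..6 over all unordered pairs of pcs."""
    vec = [0] * 6
    for a, b in combinations(pcset, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # fold each interval to a class 1..6
        vec[ic - 1] += 1
    return vec

print(interval_class_vector({0, 1, 2, 5, 6, 9}))   # -> [3, 1, 3, 4, 3, 1]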

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
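That comparison-with-pure-tones procedure has a simple equal-tempered caricature. In the sketch below (mine, assuming the common A4 = 440 Hz reference and MIDI note numbering), a frequency is quantified as the nearest tempered pitch plus a deviation in cents.

import math

def nearest_pitch(freq_hz, a4=440.0):
    """Return (MIDI note number, deviation in cents) for a frequency."""
    midi_exact = 69 + 12 * math.log2(freq_hz / a4)
    midi = round(midi_exact)
    cents = 100 * (midi_exact - midi)
    return midi, cents

print(nearest_pitch(261.63))   # ~ (60, ~0 cents): middle C
print(nearest_pitch(450.0))    # nearest pitch is A4, about 39 cents sharp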
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.
