
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM


Forte is best known for his book The Structure of Atonal Music (1973), which traces many of its
roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964). Forte was
also the editor of the Journal of Music Theory during an important period in its development,
from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal, including many
biographical details, is addressed in David Carson Berry, "Journal of Music Theory under Allen
Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, The Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.
...(1964).[6] In these works, he "applied set-theoretic principles to the analysis of unordered
collections of pitch classes, called pitch-class sets (pc sets). [...] The basic goal of
Forte's theory was to define the various relationships that existed among the relevant sets of
a work, so that contextual coherence could be demonstrated." Although the methodology derived
from Forte's work "has had its detractors ... textbooks on post-tonal analysis now routinely
teach it (to varying degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six other types of publications, from 1955 through early 2009.

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(restart required to take effect)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to

D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <1 3 1 4 0 2>.


In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: For four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of undirected intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher-order values. We input these into the
algorithm to produce a second-order segmentation, and so forth, until the music is parsed into
a single segment.

To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension: pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the segmentation is determined by the sequence of
the sizes of successive unordered pitch intervals. The segmental boundaries are shown by
vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F>
in phrase 2 are segmented out of the rest of the measure since they fall in a lower register
than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two
segments, the second into one. And in general, the boundaries of these first-level segments
never contradict our more intuitively derived phrase structure. The second pass works on the
averages of the values in each level-1 segment. These averages are simply the center pitch of
the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series
of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting
second-level segmentation divides the piece in half in the middle of the third phrase, which
contradicts our six-phrase structure. That the second-pass parsing is at variance with our
phrase structure is not an embarrassment, for we are taking pitch intervals as the only
criterion for segmentation.

Let us examine Example 21b with the algorithm's taking only time spans between notes as input.
Here the unit of time is a thirty-second note. Once again the first level basically conforms to
our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions
the stream of durations so that it has an exception inherited from level 1; the last phrase is
divided in half, with its first part serving as a conclusion to the second-level segment that
starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and
pitch. The initial values of the previous examples are simply added together. This time the
results get
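A minimal sketch of the one-dimensional case described above, in Python. Assumptions of mine,
not from the article: a strict "middle interval exceeds both neighbors" test, plain arithmetic
means between passes, and illustrative function names.

# Sketch of one-dimensional Tenney/Polansky-style segmentation.
def segment(values):
    """Start a new segment at value C whenever the middle of three
    successive undirected intervals exceeds its two neighbors."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    boundaries = [0]
    for i in range(1, len(intervals) - 1):
        if intervals[i] > intervals[i - 1] and intervals[i] > intervals[i + 1]:
            boundaries.append(i + 1)  # value after the maximal interval
    boundaries.append(len(values))
    return [values[s:e] for s, e in zip(boundaries, boundaries[1:])]

def parse(values):
    """Re-run the segmentation on segment averages (higher-order values)
    until the string is parsed into a single segment; return all levels."""
    levels = []
    while len(values) > 1:
        segs = segment(values)
        levels.append(segs)
        if len(segs) == 1:
            break
        values = [sum(s) / len(s) for s in segs]  # higher-order values
    return levels

# Toy pitch string (MIDI numbers); the large leap 67 -> 60 opens a segment.
print(parse([60, 62, 63, 67, 60, 58, 57]))

A multi-dimensional version would, per the passage, first combine each dimension's interval
string into one weighted average string and then segment that combined string the same way.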

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.

(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
1. "In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2
(1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary
of American Music, 2nd ed., ed. Charles Hiroshi Garrett (New York: Oxford University Press,
2013), 8:175-176.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
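A quick numerical illustration of the point above, as a sketch using only numpy (the
histogram-based MI estimate is a crude plug-in estimator, and the variable names are mine):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2                      # fully determined by x, but not linearly

# Pearson correlation is ~0 even though y depends completely on x.
print("corr:", np.corrcoef(x, y)[0, 1])

# Crude plug-in mutual-information estimate from a 2-D histogram.
joint, _, _ = np.histogram2d(x, y, bins=30)
pxy = joint / joint.sum()
px = pxy.sum(axis=1, keepdims=True)   # marginal of x
py = pxy.sum(axis=0, keepdims=True)   # marginal of y
nz = pxy > 0                          # avoid log(0)
mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
print("MI (nats):", mi)               # clearly > 0: the variables are dependent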
Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
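These definitions translate directly into code; a small sketch in Python, where the built-in
set plays the role of a pcset (the helper names are mine):

# Pitches as integers (e.g., MIDI numbers); pcs are pitches mod 12.
AGGREGATE = set(range(12))          # U, the set of all twelve pcs

def pcset(pitches):
    """Map a pset (pitches) to the pcset it represents: octave-related
    pitches collapse onto one pitch class."""
    return {p % 12 for p in pitches}

A = pcset([60, 64, 67])             # C major triad -> {0, 4, 7}
B = pcset([64, 67, 71])             # E minor triad -> {4, 7, 11}

print(A | B)                        # union A ∪ B
print(A & B)                        # intersection A ∩ B (common tones {4, 7})
print(AGGREGATE - A)                # complement A′ relative to U
print(A.isdisjoint(AGGREGATE - A))  # A and A′ are disjoint -> True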

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

...arlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are
non-linear ones.
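A sketch of the linear SSM step described in the Nature excerpt, using only the absolute cross
correlation |CC|; the MI-based SSMs would slot into the same place. The threshold value and all
names are assumptions of mine, not taken from the paper:

import numpy as np

def functional_network(signals, threshold=0.5):
    """signals: array of shape (n_oscillators, n_samples). Returns a boolean
    adjacency matrix linking pairs whose |Pearson CC| exceeds the threshold."""
    cc = np.abs(np.corrcoef(signals))   # pairwise |CC|, the linear SSM
    adj = cc > threshold
    np.fill_diagonal(adj, False)        # no self-loops
    return adj

# Toy data: three noisy oscillators, the first two nearly phase-locked.
rng = np.random.default_rng(1)
t = np.linspace(0, 20, 2000)
s = np.vstack([np.sin(t), np.sin(t + 0.1), np.cos(3 * t)])
s += 0.1 * rng.standard_normal(s.shape)
print(functional_network(s))            # edge between oscillators 0 and 1 only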

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-12-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
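The quoted method — assigning a frequency to a (quasi-)periodic wave — is often approximated computationally by picking the dominant autocorrelation lag. A minimal sketch of that idea (my own, not from the quoted article); the lag bounds and frame length are arbitrary choices:

```python
import numpy as np

def estimate_f0(frame, sr, fmin=50.0, fmax=1000.0):
    """Crude fundamental-frequency estimate: dominant autocorrelation lag."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)   # search plausible pitch periods
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

sr = 44_100
t = np.arange(4096) / sr                      # one short analysis frame
# A complex periodic wave: 220 Hz fundamental plus its octave partial.
wave = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(estimate_f0(wave, sr))                  # ~220 Hz
```

The estimate lands near 220 Hz, the shared period of both partials, matching the perceived pitch rather than simply the strongest spectral component.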

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have
however called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
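The leap rule quoted above ("a leap ... must be followed either by a step in the opposite direction or by another leap") is mechanical enough to check in code. A toy sketch, with intervals counted in scale steps; the permissible-sonority proviso for double leaps is deliberately left out of this simplified version.

```python
def leap_rule_ok(melody):
    """Check: every leap is followed by a contrary step or another leap.

    Pitches are given as scale degrees; any motion larger than one step
    counts as a leap. The 'permissible three-note sonority' proviso for
    consecutive leaps is omitted in this toy version.
    """
    moves = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(moves, moves[1:]):
        if abs(prev) > 1:                          # the last motion leapt
            contrary_step = abs(nxt) == 1 and (prev > 0) != (nxt > 0)
            another_leap = abs(nxt) > 1
            if not (contrary_step or another_leap):
                return False
    return True

print(leap_rule_ok([1, 2, 5, 4, 3, 2, 1]))   # True: the leap steps back down
print(leap_rule_ok([1, 3, 4, 3, 2, 1]))      # False: leap, then step in the same direction
```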

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
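Both "forbidding" expressions are in fact small computations. The interval vector tallies the interval classes among all pairs of pitch classes in a set; feeding it what I take to be the prime form of Forte's 6-Z44, [0, 1, 2, 5, 6, 9], reproduces the vector <313431> behind the label. The code itself is a generic sketch, not tied to any particular text.

```python
from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector: tally of interval classes 1-6 over all pairs."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12                 # directed interval in semitones
        vec[min(d, 12 - d) - 1] += 1     # fold into interval classes 1..6
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]
```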

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

-17-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-18-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-19-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a

-20-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-21-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-22-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch

-23-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-24-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-25-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-26-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-27-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
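
The leap rule quoted above is mechanical enough to state in code. A hedged sketch follows, for
a melody given as MIDI note numbers; the interval whitelist and the "step" threshold are
illustrative assumptions rather than a full statement of modal practice, and the check that two
successive leaps outline a permissible three-note sonority is deliberately left out.

    PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}    # semitones: m3 M3 P4 P5 m6 P8

    def leaps_ok(melody):
        steps = [b - a for a, b in zip(melody, melody[1:])]
        for i, iv in enumerate(steps):
            if abs(iv) <= 2:                 # a step: no constraint here
                continue
            if abs(iv) not in PERMITTED_LEAPS:
                return False                 # leap spans a forbidden interval
            if i + 1 == len(steps):
                continue                     # melody ends: nothing follows
            nxt = steps[i + 1]
            step_back = abs(nxt) <= 2 and nxt * iv < 0   # step, opposite way
            another_leap = abs(nxt) > 2     # would still need the sonority check
            if not (step_back or another_leap):
                return False
        return True

    print(leaps_ok([60, 64, 62, 60]))   # leap up a M3, step back down: True
    print(leaps_ok([60, 67, 69]))       # leap up a P5, then step up: False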

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
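
Since the vocabulary surfaces here, a short sketch of what one of those expressions denotes:
the interval vector of a pitch-class set counts, for each interval class 1 through 6, the pairs
of pitch classes lying that many semitones apart (mod 12, folded into the range 1..6). A
minimal Python rendering:

    from itertools import combinations

    def interval_vector(pcs):
        vec = [0] * 6
        for a, b in combinations(pcs, 2):
            ic = (b - a) % 12
            ic = min(ic, 12 - ic)            # interval class, 1..6
            vec[ic - 1] += 1
        return vec

    # the all-interval tetrachord contains each interval class exactly once
    print(interval_vector([0, 1, 4, 6]))        # [1, 1, 1, 1, 1, 1]
    # the hexachord 6-Z44, prime form [0, 1, 2, 5, 6, 9]
    print(interval_vector([0, 1, 2, 5, 6, 9]))  # [3, 1, 3, 4, 3, 1]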

-30-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-31-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-32-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-33-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-34-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-35-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof

-36-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and

-37-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-38-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-39-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-40-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities (see
the sketch after this passage). In multi-voice contexts, the leading of a voice is determined
even further. As I compose, for instance, I ask: Will the next note I write down form a
consonance with the other voices? If not, is the dissonance correctly prepared and resolved?
What scale degrees and harmonies are involved? (And the answers to such questions will of
course depend on whether the note is in the bass, soprano, or an inner voice.) But these
voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse
the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or
not, and so forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have, however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example,
in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often
called lynes, are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given lyne.3 For
instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of
articulation, or any combination of these, thereby separating it out from
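The leap rule quoted above is stated precisely enough to check mechanically. A toy sketch, with
assumptions of my own: melodies are encoded as diatonic scale-step numbers, an interval of +/-1
counts as a step, anything larger counts as a leap, and the permissible-sonority test on
successive leaps is omitted.

def leap_rule_ok(melody):
    # Hypothetical checker: every leap must be followed by a step in the
    # opposite direction or by another leap (sonority test omitted).
    iv = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(iv, iv[1:]):
        if abs(prev) > 1:                      # prev is a leap
            step_back = abs(nxt) == 1 and prev * nxt < 0
            another_leap = abs(nxt) > 1
            if not (step_back or another_leap):
                return False
    return True

print(leap_rule_ok([1, 5, 4, 3, 2, 1]))  # leap up, steps back down -> True
print(leap_rule_ok([1, 5, 6, 5, 1]))     # leap followed by step up  -> False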

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
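Since the excerpt notes that pitches are usually quantified as frequencies by comparison with
periodic pure tones, here is a minimal illustration of one standard way to pull a frequency out
of a waveform: autocorrelation-based F0 estimation. This is my own sketch, unrelated to the
quoted sources; the 50-2000 Hz search range is an arbitrary choice.

import numpy as np

fs = 44100                                   # sample rate, Hz
t = np.arange(int(0.05 * fs)) / fs
x = np.sin(2 * np.pi * 440.0 * t)            # 50 ms test tone at 440 Hz

# Autocorrelation for non-negative lags; the first strong peak after lag 0
# sits at the period of the waveform.
ac = np.correlate(x, x, mode="full")[x.size - 1:]
lo, hi = int(fs / 2000), int(fs / 50)        # restrict to 50-2000 Hz pitches
lag = lo + int(np.argmax(ac[lo:hi]))
print("estimated frequency:", fs / lag, "Hz")  # approximately 440 Hz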

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-44-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-45-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-46-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-47-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-48-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-49-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale

-50-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-51-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-52-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and

-53-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-54-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-55-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.


In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth.
Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have, however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure. For
example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments,
often called lynes, are realized in pitch, time, and other musical dimensions, using some
means of musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic
level, a mode of articulation, or any combination of these, thereby separating it out from
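
The cantus-firmus leap rule quoted above is mechanical enough to sketch in code; the
step/leap threshold below is my own simplification of the modal rules, and the "permissible
sonorities" clause is deliberately left unchecked:

def leap_rule_violations(melody):
    # melody: a list of scale-step numbers; a move of more than one step
    # counts as a leap. A leap must be answered by a step in the opposite
    # direction, or by another leap (whose permissibility the modal rules
    # would then constrain further -- not modeled in this sketch).
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 1:
            step_back = abs(second) == 1 and first * second < 0
            another_leap = abs(second) > 1
            if not (step_back or another_leap):
                violations.append(i + 1)   # arrival note of the offending leap
    return violations

print(leap_rule_violations([1, 4, 3, 2, 5, 6]))   # [4]: the leap to 5 continues upward by step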

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password: a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
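
This matching of a sound against pure tones of known frequency has a closed-form counterpart
in the common equal-tempered MIDI convention (A4 = 440 Hz = note 69, twelve semitones per
doubling of frequency); a small sketch:

import math

def frequency_to_midi(f_hz, a4_hz=440.0):
    # Each doubling of frequency adds 12 semitones; A4 is MIDI note 69.
    return 69 + 12 * math.log2(f_hz / a4_hz)

print(frequency_to_midi(440.0))    # 69.0 -> A4
print(frequency_to_midi(261.63))   # ~60  -> middle C
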
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′.
NB: A ∩ A′ = ∅ (A and A′ are disjoint).
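
These definitions translate almost one-for-one into code; a minimal sketch using Python's
built-in sets (the example pitches and the interval-vector helper are my own illustrations,
not part of the primer):

from itertools import combinations

AGGREGATE = frozenset(range(12))      # U, the set of all twelve pcs
NULL = frozenset()                    # the empty (null) set

def pcset(pitches):
    # (4): pitches map to pcs mod 12; octave-related pitches coincide.
    return frozenset(p % 12 for p in pitches)

A = pcset([60, 64, 67])               # a C major triad -> {0, 4, 7}
B = pcset([55, 59, 62])               # a G major triad -> {7, 11, 2}

print(A | B)                          # (9)  union
print(A & B)                          # (10) intersection: {7}
print(A.isdisjoint(B))                # (11) False: they share pc 7
print(AGGREGATE - A)                  # (12) the complement A'
print(A & (AGGREGATE - A) == NULL)    # NB: A ∩ A' is the null set

def interval_vector(pcs):
    # The "interval vector" of the set-theory vocabulary: counts of
    # interval classes 1..6 over all unordered pc pairs.
    v = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)
        v[ic - 1] += 1
    return v

print(interval_vector(A))             # [0, 0, 1, 1, 1, 0] for the major triad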

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f

-64-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-65-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-66-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-67-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-68-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-69-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-70-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
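
As a toy illustration of the leap rule paraphrased above (a deliberate simplification of my own, ignoring the permissible-sonority proviso and everything modal), one might check a melody given in semitones like this:

def leap_is_resolved(melody, step=2):
    # melody: pitches in semitones (e.g., MIDI numbers). Returns False
    # when a leap is followed by a step in the same direction, the motion
    # the rule forbids; opposite-direction steps and further leaps pass.
    ivs = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(ivs, ivs[1:]):
        leap = abs(prev) > step
        same_direction = prev * nxt > 0
        is_step = 0 < abs(nxt) <= step
        if leap and same_direction and is_step:
            return False
    return True

print(leap_is_resolved([60, 65, 64, 62]))  # leap up, step down -> True
print(leap_is_resolved([60, 65, 67]))      # leap up, step up   -> False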

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
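
The "interval vector" mentioned above is straightforward to compute: it counts the interval classes (1 through 6) formed by every pair of pitch classes in a set. A short sketch, using Forte's 6-Z44 set {0, 1, 2, 5, 6, 9} as input:

from itertools import combinations

def interval_vector(pcs):
    # Interval-class vector of a pitch-class set: one entry per
    # interval class 1..6, counted over all unordered pairs.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)
        vec[ic - 1] += 1
    return vec

print(interval_vector({0, 1, 2, 5, 6, 9}))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]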

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
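
That quantification is the familiar equal-tempered mapping; a small sketch (assuming A4 = 440 Hz and rounding to the nearest semitone):

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz):
    # 12-tone equal temperament: 69 is the MIDI number of A4 (440 Hz).
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NAMES[midi % 12]}{midi // 12 - 1}"

print(nearest_pitch(440.0))   # A4
print(nearest_pitch(261.63))  # C4 (middle C)
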
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-72-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-73-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar

-74-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-75-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-76-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

-77-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-78-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-79-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-80-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-81-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-82-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-83-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that

-84-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
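
That comparison step can be sketched directly. Assuming twelve-tone equal temperament with
A4 = 440 Hz (a convention the excerpt does not itself fix), a measured frequency maps to the
nearest note number as follows:

import math

A4 = 440.0  # assumed tuning reference

def nearest_midi_note(freq_hz):
    # in 12-TET an octave doubles the frequency and spans 12 note numbers
    return round(69 + 12 * math.log2(freq_hz / A4))

print(nearest_midi_note(261.6))  # 60, i.e. middle C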
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the
product of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships.
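
A quick numerical check of that distinction (a sketch assuming NumPy and scikit-learn are
available; the 32-bin discretization is an arbitrary choice): for y = x^2 the correlation is
near zero while a binned MI estimate is clearly positive:

import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2  # deterministic but non-monotonic dependence

print(np.corrcoef(x, y)[0, 1])  # ~0: correlation misses it

# crude MI estimate: discretize both variables, then count co-occurrences
cx = np.digitize(x, np.histogram_bin_edges(x, 32))
cy = np.digitize(y, np.histogram_bin_edges(y, 32))
print(mutual_info_score(cx, cy))  # > 0: MI detects the dependence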

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the
time-series ordinal patterns (MIOP).[25] The former is a linear measure and the two latter are
non-linear ones.
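
A rough sketch of those three SSMs, under my own choices of discretization and an embedding
length m = 3 (the article's exact settings are not given here), might read:

import numpy as np
from sklearn.metrics import mutual_info_score

def discretize(x, bins=16):
    return np.digitize(x, np.histogram_bin_edges(x, bins))

def ordinal_patterns(x, m=3):
    # permutation (ordinal) pattern of each length-m window, as an int label
    w = np.lib.stride_tricks.sliding_window_view(np.asarray(x), m)
    return [hash(tuple(np.argsort(row))) for row in w]

def ssm_cc(x, y):   # absolute cross correlation (Pearson's coefficient)
    return abs(np.corrcoef(x, y)[0, 1])

def ssm_mi(x, y):   # mutual information of the discretized series
    return mutual_info_score(discretize(x), discretize(y))

def ssm_miop(x, y, m=3):  # mutual information of the ordinal patterns
    return mutual_info_score(ordinal_patterns(x, m), ordinal_patterns(y, m))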

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
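
That practical asymmetry is easy to demonstrate (again a sketch assuming NumPy and
scikit-learn): the sample covariance is one moment computation, while a histogram-based MI
estimate shifts with the arbitrary bin count:

import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = x + rng.normal(size=500)

print(np.cov(x, y)[0, 1])  # covariance: a direct sample moment

# the MI estimate depends on how the unknown densities are approximated
for bins in (4, 16, 64):
    cx = np.digitize(x, np.histogram_bin_edges(x, bins))
    cy = np.digitize(y, np.histogram_bin_edges(y, bins))
    print(bins, mutual_info_score(cx, cy))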


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)

(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-98-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL divergence between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
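A toy demonstration of that distinction (my own, not from the quoted post; the histogram estimator is
one crude choice among many): for Y = X^2 with X symmetric about zero, the Pearson correlation is near
zero while the estimated mutual information is clearly positive, because Y is a deterministic function
of X.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2  # non-monotonic dependence

def mi_histogram(a, b, bins=32):
    """Rough mutual-information estimate (in nats) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

print(np.corrcoef(x, y)[0, 1])  # ~0: no linear relationship
print(mi_histogram(x, y))       # clearly > 0: strong dependence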

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient), CC; the mutual information, MI; and the mutual information of the
time-series ordinal patterns, MIOP.[25] The former is a linear measure and the two latter are
non-linear ones.
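Of the three SSMs named, the cross-correlation one is the simplest to sketch (again my illustration,
assuming the zero-lag Pearson coefficient; the MI and MIOP variants would substitute a different
pairwise measure):

import numpy as np

def cc_similarity_matrix(series):
    """series: (n_oscillators, n_samples) array -> (n, n) matrix of |CC|."""
    return np.abs(np.corrcoef(series))

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1000)
osc = np.vstack([
    np.sin(2 * np.pi * 1.0 * t),        # oscillator 0
    np.sin(2 * np.pi * 1.0 * t + 0.3),  # phase-shifted copy: high |CC| with 0
    rng.standard_normal(t.size),        # noise: low |CC| with both
])
print(cc_similarity_matrix(osc).round(2))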

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task than the estimation of Covariance.
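A short sketch of that last contrast (illustrative assumptions: jointly Gaussian-ish data, histogram
MI estimator): the sample covariance comes straight from moments of the data, while the MI estimate
moves with the density-estimation choice, here the histogram bin count.

import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(5000)
y = 0.5 * x + rng.standard_normal(5000)

print(np.cov(x, y)[0, 1])  # ~0.5, no distributional assumptions needed

for bins in (4, 32, 256):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
    print(bins, round(mi, 3))  # the estimate drifts with the bin choice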

In single-voice writing there are "rules" for the way a melody should progress. In the composition of
a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be
followed either by a step in the opposite direction or by another leap, provided the two successive
leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of
a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down
form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved?
What scale degrees and harmonies are involved? (And the answers to such questions will of course depend
on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not
arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or
jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts
have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such
as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as
Schenkerian prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other
theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch
structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc
segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some
means of musical articulation to maintain an association between the components of a given lyne.3 For
instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of
articulation, or any combination of these, thereby separating it out from ...
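The leap rule quoted at the start of this passage is mechanical enough to encode. A toy checker (my
illustration, not the author's formulation; the interval whitelist is an assumption, and the
three-note-sonority condition on successive leaps is omitted for brevity):

ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}  # semitones: m3, M3, P4, P5, m6, P8 (assumed set)

def check_leaps(melody):
    """melody: list of MIDI notes. Yield (index, message) for rule violations."""
    for i in range(len(melody) - 1):
        leap = melody[i + 1] - melody[i]
        if abs(leap) <= 2:  # a step (or repetition): no constraint here
            continue
        if abs(leap) not in ALLOWED_LEAPS:
            yield i, f"forbidden leap of {abs(leap)} semitones"
        elif i + 2 < len(melody):
            nxt = melody[i + 2] - melody[i + 1]
            step_back = 0 < abs(nxt) <= 2 and nxt * leap < 0
            another_leap = abs(nxt) in ALLOWED_LEAPS
            if not (step_back or another_leap):
                yield i + 1, "leap not followed by a contrary step or another leap"

# The P5 leap up to A is followed by a step in the same direction: flagged.
print(list(check_leaps([60, 64, 62, 69, 71])))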

-104-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-105-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-106-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-107-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-108-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-109-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-110-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-111-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
[1] "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
[2] http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
[3] Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
Quarterly 77/4 (1993), unnumbered note on 644-645.
[4] David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 8.
[5] David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
[6] Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964):
136-183.
[7] David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary
of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University
Press, 2013), 8:175-176.

Pitch-Class Set Theory Primer
Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
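
A small sketch of points (1)-(12) in Python (my own illustration, not part of the primer): pitches are integers, pcs are pitches mod 12, and pcsets support the finite-set operations just defined.

# Pitches as integers (middle C = 60); a pset is unordered in time.
pset = {60, 64, 67, 76}                  # C4, E4, G4, E5
pcset = frozenset(p % 12 for p in pset)  # octave-related pitches merge: {0, 4, 7}

U = frozenset(range(12))                 # the aggregate
A = frozenset({0, 4, 7})
B = frozenset({7, 11, 2})

print(A | B)        # union of A and B
print(A & B)        # intersection: {7}, so A and B are not disjoint
print(U - A)        # A', the complement of A in the aggregate
print(A & (U - A))  # the empty set: A and A' are disjoint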

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth.
Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have however called such work into question.2 Other theorists have obviated
voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical
articulation to maintain an association between the components of a given lyne.3 For
instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode
of articulation, or any combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
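
To make the frequency-to-pitch mapping concrete (a standard equal-temperament calculation, my own sketch rather than part of the article): comparing a measured frequency against pure tones tuned around A4 = 440 Hz amounts to rounding its distance in semitones.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz):
    """Nearest equal-tempered pitch name for a frequency (A4 = 440 Hz)."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))  # semitone distance from A4
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"   # MIDI 60 -> 'C4'

print(nearest_pitch(261.63))  # 'C4'
print(nearest_pitch(452.0))   # 'A4': a slightly sharp tone is still assigned to A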

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed

170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversi
on-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64
d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the
string of values is segmented in half; the value C starts a new segment or phrase. In its
simplest form, using only one musical dimension, the algorithm works by going through the
dimension's list of undirected intervals in threes looking for maximum values and segmenting
accordingly. This results in a series of successive segments (or phrases). We can then average
the values in each of the output segments to get a series of new higher-order values. We input
these into the algorithm to produce a second-order segmentation, and so forth, until the music
is parsed into a single segment. To illustrate the Tenney/Polansky algorithm, we perform it on
the Schoenberg piece. Example 21a shows the results using one dimension, pitch alone. The
first pass segments the pitches into segments of three to six pitches; that is, the
segmentation is determined by the sequence of the sizes of successive unordered pitch
intervals.

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <1 3 1 4 0 2>
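
A minimal sketch of the first-order pass and the recursion described above (my own reconstruction of the procedure as summarized here, not Tenney and Polansky's published code; one dimension, undirected intervals):

def segment(values):
    """One pass: open a new segment where an interval exceeds both neighbors."""
    ivls = [abs(b - a) for a, b in zip(values, values[1:])]
    segments, start = [], 0
    # Examine intervals in threes (X, Y, Z); a local-maximum Y means the
    # value after it (C in Example 20) starts a new segment.
    for i in range(1, len(ivls) - 1):
        if ivls[i] > ivls[i - 1] and ivls[i] > ivls[i + 1]:
            segments.append(values[start:i + 1])
            start = i + 1
    segments.append(values[start:])
    return segments

def parse(values):
    """Iterate: average each segment and re-segment until one segment remains."""
    levels = [segment(values)]
    while len(levels[-1]) > 1:
        means = [sum(s) / len(s) for s in levels[-1]]
        nxt = segment(means)
        if len(nxt) == len(means):  # no further grouping possible; stop
            break
        levels.append(nxt)
    return levels

print(parse([60, 62, 61, 70, 69, 71, 59, 58, 60, 72]))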

In its simplest form, using only one musical dimension, the algorithm
works by going through the dimension's list of un- directed intervals in threes looking for
maximum values and segmenting accordingly. This results in a series of successive segments (or
phrases). We can then average the values in each of the output segments to get a series of new
higher- order values. We input these into the algorithm to produce a second-order segmentation
and so forth, until the music is parsed into a single segment. To illustrate the
Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a shows the results
using one dimension -pitch alone. The first pass segments the pitches into segments of three to
six pitches; that is, the seg- mentation is determined by the sequence of the sizes of suc-
cessive unordered pitch intervals. The segmental boundaries In essence, the algorithm looks at
a string of intervals derived from the successive values in some musical dimension in a piece

-127-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
of music. The string might be a series of pitch intervals, time intervals (delays), dynamic
changes, and so forth. More than one string can be selected for analysis. Then the algorithm
combines the values of each dimension's suc- cessive intervals according to a user-specified
average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the
principle of the Tenney/ Polansky algorithm: For four successive dimension values labeled A
through D forming three successive unordered intervals labeled X, Y, and Z, if the middle
interval is greater than the other two intervals, the string of values is segmented in half;
the value C starts a new segment or phrase. In its simplest form, using only one musical
dimension, the algorithm works by going through the dimension's list of un- directed intervals
in threes looking for maximum values and segmenting accordingly. This results in a series of
successive segments (or phrases). We can then average the values in each of the output segments
to get a series of new higher- order values. We input these into the algorithm to produce a
second-order segmentation and so forth, until the music is parsed into a single segment. To
illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension -pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the seg- mentation is determined by the sequence of
the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In essence, the
algorithm looks at a string of intervals derived from the successive values in some musical
dimension in a piece of music. The string might be a series of pitch intervals, time intervals
(delays), dynamic changes, and so forth. More than one string can be selected for analysis.
Then the algorithm combines the values of each dimension's suc- cessive intervals according to
a user-specified average which assigns a relative "weight" to each of the dimensions. Example
20 illustrates the principle of the Tenney/ Polansky algorithm: For four successive dimensi

on values labeled A through D forming three successive unordered intervals labeled X, Y, and Z,
if the middle interval is greater than the other two intervals, the string of values is
segmented in half; the value C starts a new segment or phrase. In its simplest form, using only
one musical dimension, the algorithm works by going through the dimension's list of un-
directed intervals in threes looking for maximum values and segmenting accordingly. This
results in a series of successive segments (or phrases). We can then average the values in each
of the output segments to get a series of new higher- order values. We input these into the
algorithm to produce a second-order segmentation and so forth, until the music is parsed into a
single segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg
piece. Example 21a shows the results using one dimension -pitch alone. The first pass segments
the pitches into segments of three to six pitches; that is, the seg- mentation is determined by
the sequence of the sizes of suc- cessive unordered pitch intervals. The segmental boundaries
are shown by vertical lines. The results are quite reasonable. For instance, the four pitches
<E, Ft, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a
lower register from the others. Phrases 4 and 5 seem seg- mented correctly; the first is
divided into two segments, the second into one. And in general, the boundaries of these
first-level segments never contradict our more intuitively de- rived phrase structure. The
second pass works on the aver- ages of the values in each level-1 segment. These averages are
simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The
intervals between the series of bandwidths forming level 2 are the input to the second pass of
the algorithm. The resulting second-level seg- mentation divides the piece in half in the
middle of the third phrase, which contradicts our six-phrase structure. That the second-pass
parsing is at variance with our phrase structure is not an embarrassment, for we are taking
pitch intervals as the only criterion for segmentation. Let us ex- amine Example 21b with the
algorithm's taking only time spans between notes as input. Here the unit of time is a
thirty-second note. Once again the first level basically con- forms to our ideas of the phrase
structure, with two excep- tions. Likewise, the second pass partitions the stream of dura-
tions so that it has an exception inherited from level 1; the last phrase is divided in half,
with its first part serving as a conclusion to the second-level segment that starts at phrase
4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial
values of the previous examples are simply added together. This time the results get
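Since the procedure is described step by step above, a short sketch may make it concrete. The following is a minimal, single-dimension reading of the Tenney/Polansky procedure as paraphrased here, not their published implementation; the function names are mine, and the user-weighted averaging of several dimensions is omitted.

def segment(values):
    """Split a list of values wherever the middle of three successive
    unordered (absolute) intervals exceeds its two neighbors."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    boundaries = []
    # Slide over the intervals in threes (X, Y, Z); when Y > X and
    # Y > Z, a boundary falls before the value that starts interval Z
    # (the value "C" of Example 20).
    for i in range(1, len(intervals) - 1):
        x, y, z = intervals[i - 1], intervals[i], intervals[i + 1]
        if y > x and y > z:
            boundaries.append(i + 1)
    segments, start = [], 0
    for b in boundaries:
        segments.append(values[start:b])
        start = b
    segments.append(values[start:])
    return segments

def hierarchical_parse(values):
    """Re-apply the segmentation to each level's segment averages
    until the input is parsed into a single segment."""
    levels = []
    while True:
        segs = segment(values)
        levels.append(segs)
        if len(segs) == 1:
            return levels
        # The text averages each output segment to get the next
        # level's input values (for pitch it uses the center of each
        # segment's bandwidth; the plain mean is used here).
        values = [sum(s) / len(s) for s in segs]

Feeding the sketch a list of pitches (say, as MIDI note numbers) corresponds to the pitch-only parsing of Example 21a; feeding it time spans in thirty-second-note units corresponds to Example 21b; and summing the two value streams element by element approximates the combined parsing of Example 21c.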

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

References
1. "In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

External links

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
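The covariance point is easy to demonstrate numerically. A small sketch, assuming NumPy is available: for Y = X^2 with X symmetric about zero, the covariance and Pearson correlation come out near zero even though Y is completely determined by X, while a histogram-based plug-in estimate of the mutual information is clearly positive. The 16-bin histogram is an arbitrary choice, and its sensitivity to binning illustrates the estimation caveat raised above.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2                          # deterministic but non-monotonic

print(np.cov(x, y)[0, 1])           # ~0: covariance misses the link
print(np.corrcoef(x, y)[0, 1])      # ~0: so does Pearson correlation

# Plug-in MI estimate (in nats) from a joint histogram.
joint, _, _ = np.histogram2d(x, y, bins=16)
pxy = joint / joint.sum()
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
print(mi)                           # clearly > 0: MI detects the dependence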
Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
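These definitions translate almost directly into code. A minimal sketch using Python's built-in set type, with pitches as integers (MIDI note numbers are assumed here for concreteness, an assumption of this sketch, not the primer) and pitch classes as their residues mod 12:

AGGREGATE = set(range(12))          # U, the set of all twelve pcs

def pcs(pitches):
    """Map a pset to the pcset it represents: octave-related
    pitches collapse to the same pc (definition 4)."""
    return {p % 12 for p in pitches}

a = pcs([60, 64, 67])               # C major triad -> {0, 4, 7}
b = pcs([67, 71, 74])               # G major triad -> {2, 7, 11}

union = a | b                       # A ∪ B: the content of both
common = a & b                      # A ∩ B: common elements, here {7}
complement = AGGREGATE - a          # A′: all pcs of U not in A
disjoint = not (a & b)              # True iff A ∩ B = ∅
assert not (a & complement)         # A ∩ A′ = ∅, as in definition (12)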

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional


https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
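As a concrete instance of quantifying pitches as frequencies: under the common convention of twelve-tone equal temperament with A4 tuned to 440 Hz (a standard convention, though the passage above does not name it), each semitone step multiplies frequency by 2^(1/12). A one-line sketch:

def midi_to_hz(n, a4=440.0):
    """Frequency in hertz of MIDI note number n (A4 = 69) in
    twelve-tone equal temperament."""
    return a4 * 2.0 ** ((n - 69) / 12)

print(midi_to_hz(69))   # 440.0 (A4)
print(midi_to_hz(60))   # ~261.63 (middle C)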

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI and the mutual information of the time series ordinal patterns (MIOP). The former is a linear measure and the two latter are non-linear ones.
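The quoted construction can be sketched in a few lines. This is a hedged reading, not the paper's code: it links every pair of time series whose statistical similarity exceeds a threshold, using only the absolute cross correlation |CC| of the three SSMs named above (the MI and MIOP measures, and the paper's thresholding details, are left out).

import numpy as np

def functional_network(series, threshold):
    """series: array of shape (n_oscillators, n_samples).
    Returns a boolean adjacency matrix linking pairs whose
    |Pearson correlation| exceeds the threshold."""
    cc = np.abs(np.corrcoef(series))    # pairwise |CC|
    adj = cc > threshold
    np.fill_diagonal(adj, False)        # no self-links
    return adj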

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-132-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-133-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-134-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-135-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-136-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-137-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-138-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-139-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary or for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some
extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix
Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as
Schenkerian prolongation to music that appears to have little to do with tonality or even pitch
concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2]
Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of
pitch structure. For example, in my own theory of compositional design, ensembles of
(un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other
musical dimensions, using some means of musical articulation to maintain an association between
the components of a given lyne.[3] For instance, a lyne might be associated with a register, an
instrument, a dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...
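
As a toy illustration of how such a rule can be made mechanical (a deliberately crude reading
of one cantus-firmus constraint, not a full counterpoint checker), the following Python sketch
flags leaps that are answered neither by a step in the opposite direction nor by another leap:

def leap_violations(melody, leap=3):
    # melody: MIDI note numbers; a "leap" here is any interval of `leap` semitones or more.
    # The permissible-sonority proviso on consecutive leaps is deliberately ignored.
    bad = []
    for i in range(len(melody) - 2):
        a = melody[i + 1] - melody[i]
        b = melody[i + 2] - melody[i + 1]
        step_back = abs(b) <= 2 and a * b < 0     # step in the opposite direction
        another_leap = abs(b) >= leap
        if abs(a) >= leap and not (step_back or another_leap):
            bad.append(i + 1)                     # index of the note that began the bad leap
    return bad

print(leap_violations([60, 67, 65, 64, 62, 60]))  # leap C4-G4 answered by steps down: []
print(leap_violations([60, 67, 69, 67, 65, 64]))  # leap followed by a step upward: [1]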

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
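
The "interval vector" itself is mechanical to compute: count the interval classes (1 through 6)
over all unordered pairs of distinct pitch classes. A minimal Python sketch, applied here to
[0, 1, 2, 5, 6, 9], the prime form commonly given for 6-Z44:

from itertools import combinations

def interval_vector(pcs):
    # pcs: distinct pitch classes 0-11; returns counts of interval classes 1..6.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))       # -> [3, 1, 3, 4, 3, 1]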

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for

-144-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
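
Because pitches are conventionally quantified by comparison with pure tones of known frequency,
the mapping between hertz and equal-tempered scale positions is a simple logarithm. A small
Python sketch using the MIDI convention (A4 = 440 Hz = note 69); the convention, not the
perception, is what the code encodes:

import math

def hz_to_midi(f):
    # Nearest equal-tempered MIDI note for a frequency in hertz (A4 = 440 Hz = 69).
    return round(69 + 12 * math.log2(f / 440.0))

def midi_to_hz(n):
    return 440.0 * 2.0 ** ((n - 69) / 12)

print(hz_to_midi(261.63))          # middle C -> 60
print(round(midi_to_hz(60), 2))    # 60 -> 261.63 Hz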

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-155-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-156-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have however called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
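
As a toy illustration of how such a melodic rule can be checked mechanically (our sketch; the semitone thresholds are simplifying assumptions, not the full counterpoint rules):

def leap_rule_violations(pitches):
    # Flag each leap (interval > 2 semitones) that is not followed by a
    # step (1-2 semitones) in the opposite direction or by another leap.
    bad = []
    for i in range(len(pitches) - 2):
        first = pitches[i + 1] - pitches[i]
        second = pitches[i + 2] - pitches[i + 1]
        if abs(first) > 2:  # a leap...
            contrary_step = 1 <= abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2  # permissive simplification
            if not (contrary_step or another_leap):
                bad.append(i + 1)
    return bad

# MIDI pitches: the leap 60->64 resolves by step down; the leap 62->67
# is followed by a step in the same direction, so index 3 is flagged.
print(leap_rule_violations([60, 64, 62, 67, 69]))  # [3]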

... earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-158-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-159-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for

-160-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-161-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-162-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s

-163-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-164-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-165-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-166-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-167-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
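
The leap rule just stated is mechanical enough to express as code. A minimal sketch, with a
hypothetical whitelist of permissible leaps in semitones (treatises differ, and the
three-note-sonority proviso for consecutive leaps is omitted here):

PERMISSIBLE_LEAPS = {3, 4, 5, 7, 8, 12}   # semitones; an illustrative whitelist
STEP = {1, 2}                             # half and whole step

def leaps_ok(melody):
    # melody: a list of MIDI note numbers. A leap must be followed by a step
    # in the opposite direction or by another (permissible) leap.
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) in PERMISSIBLE_LEAPS:
            steps_back = abs(second) in STEP and first * second < 0
            leaps_on = abs(second) in PERMISSIBLE_LEAPS
            if not (steps_back or leaps_on):
                return False
        elif abs(first) not in STEP and first != 0:
            return False                  # neither a step nor an allowed leap
    return True

print(leaps_ok([60, 64, 62, 60]))   # leap up a third, then step back down: True
print(leaps_ok([60, 67, 69]))       # leap up a fifth, then step in the same direction: False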

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
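
The comparison-with-pure-tones quantification has a standard computational shorthand:
equal-tempered pitch numbers are logarithmic in frequency. A small sketch, assuming the usual
440 Hz = MIDI 69 reference (a convention, not something the excerpt above fixes):

import math

def midi_from_frequency(f_hz, ref_hz=440.0, ref_midi=69):
    # Pitch is logarithmic in frequency: one octave (a doubling) spans 12 semitones.
    return ref_midi + 12 * math.log2(f_hz / ref_hz)

for f in (261.63, 440.0, 880.0):
    print(f"{f:7.2f} Hz -> MIDI {midi_from_frequency(f):6.2f}")   # ~60 (middle C), 69, 81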
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
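
Temporal-coding accounts have a rough computational analogue in autocorrelation pitch
estimators: the lag at which a signal best matches a shifted copy of itself picks out the
repetition rate that neurons would phase-lock to. A toy sketch of mine; real pitch trackers are
considerably more careful:

import numpy as np

def autocorr_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    # Search lags corresponding to plausible pitch periods for maximal self-similarity.
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]   # lags 0 .. N-1
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    best_lag = lo + int(np.argmax(ac[lo:hi]))
    return sample_rate / best_lag

sr = 16000
t = np.arange(4000) / sr                                  # 0.25 s of signal
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(f"estimated pitch ~ {autocorr_pitch(tone, sr):.1f} Hz")   # ~220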

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
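
Definitions (6)-(12) map directly onto built-in set operations; a small Python sketch follows
(the MIDI-style pitch numbering, with middle C = 60, is an assumption of the example, not part
of the primer):

U = frozenset(range(12))                 # the aggregate

def pcs(*pitches):
    # realize pitches as a pcset by taking them mod 12
    return frozenset(p % 12 for p in pitches)

A = pcs(60, 64, 67)                      # C major triad -> {0, 4, 7}
B = pcs(67, 71, 74)                      # G major triad -> {2, 7, 11}

print(A | B)                             # union, A ∪ B
print(A & B)                             # intersection, A ∩ B = {7}
print(A.isdisjoint(B))                   # False: they share pc 7
print(U - A)                             # complement A′ (the other 9 pcs)
print((A & (U - A)) == frozenset())      # A ∩ A′ is the null set: True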

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
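
That quantification can be sketched as rounding to the nearest equal-tempered pitch (the
A4 = 440 Hz reference and the MIDI-style octave naming below are assumptions of the example,
not part of the quoted text):

import math

def hz_to_pitch(f, a4=440.0):
    # nearest 12-TET pitch number (MIDI convention), returned as a note name
    n = round(69 + 12 * math.log2(f / a4))
    names = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
    return f"{names[n % 12]}{n // 12 - 1}"

print(hz_to_pitch(440.0))    # 'A4'
print(hz_to_pitch(261.63))   # 'C4' (middle C, approximately)
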
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-185-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-186-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-187-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-188-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-189-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-190-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a

-191-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-192-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-193-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds

-194-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-195-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-196-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
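The cantus-firmus leap rule quoted above is stated crisply enough to be checked mechanically. As a toy sketch only (the semitone thresholds and the reduced rule set are my simplifications; the full contrapuntal canon also constrains which intervals may be leapt and which three-note sonorities are permissible), the following flags a leap that is followed neither by a step in the opposite direction nor by another leap.

def leap_violations(pitches):
    # pitches: a melody as MIDI note numbers.
    # "Step" = at most 2 semitones; "leap" = anything larger (an assumption).
    violations = []
    for i in range(len(pitches) - 2):
        first = pitches[i + 1] - pitches[i]
        second = pitches[i + 2] - pitches[i + 1]
        if abs(first) > 2:  # a leap
            steps_back = abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2
            if not (steps_back or another_leap):
                violations.append(i + 1)
    return violations

print(leap_violations([62, 65, 64]))  # []  -- leap up, step back down
print(leap_violations([62, 65, 67]))  # [1] -- leap up, then a step up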

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
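As a small illustration of the earlier point that a waveform's oscillations can be measured to obtain a frequency, this sketch (the synthetic signal, sample rate, and autocorrelation method are all choices made for the demo, not taken from the article) estimates the fundamental of a complex tone whose partials are harmonics of 220 Hz; picking periodicity out of the signal's time structure is loosely analogous to the temporal-coding account above.

import numpy as np

fs = 44_100                        # sample rate in Hz (demo choice)
t = np.arange(0, 0.5, 1 / fs)
f0 = 220.0                         # fundamental of the synthetic tone
sig = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in (1, 2, 3, 4))

# The autocorrelation of a periodic signal peaks at lags that are
# multiples of the period; the first strong peak gives the pitch period.
ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]
min_lag = int(fs / 1000)           # ignore implausibly high pitches (>1 kHz)
period = min_lag + np.argmax(ac[min_lag:])
print(f"estimated pitch: {fs / period:.1f} Hz (true f0 = {f0} Hz)")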

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-198-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-199-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-200-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-201-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-202-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-203-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a

-204-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-205-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-206-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-207-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-208-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-209-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir

-210-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the product of the individual densities, so MI can also capture non-monotonic and other more complicated relationships.
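A minimal sketch of the contrast, using a plug-in (histogram) estimate of MI; the estimator and bin count are my choices, not from the sources quoted here:

import numpy as np

# Plug-in (histogram) estimate of mutual information in bits.
def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                        # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(y), shape (1, bins)
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x**2 + rng.normal(0.0, 0.05, x.size)    # strong but non-monotonic dependence

print(np.corrcoef(x, y)[0, 1])              # ~0: no *linear* relationship
print(mutual_information(x, y))             # clearly > 0: dependence detected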

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns (MIOP).[25] The former is a linear measure and the two latter are non-linear ones.
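The ordinal-pattern step is presumably the Bandt-Pompe symbolization: each length-m window of a series is replaced by the permutation that sorts it. A minimal sketch with m = 3:

import numpy as np
from itertools import permutations

# Bandt-Pompe symbolization: map each length-m window to the index of the
# permutation that sorts it.
def ordinal_symbols(x, m=3):
    table = {p: k for k, p in enumerate(permutations(range(m)))}
    return np.array([table[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

x = np.array([4.0, 7.0, 9.0, 10.0, 6.0, 11.0, 3.0])
print(ordinal_symbols(x))   # [0 0 4 2 4]: one symbol per window

MIOP would then be the mutual information (as sketched above) between the symbol streams of two series.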

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" with whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain task compared to the estimation of Covariance.
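A compact instance of that middle claim: take X uniform on [-1, 1] and Y = X^2. Then Cov(X, Y) = E[X^3] - E[X]E[X^2] = 0 - 0 = 0, so correlation sees nothing, yet Y is a deterministic function of X and the mutual information I(X; Y) is strictly positive, since every value of X pins down Y exactly.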

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary rules for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
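The leap rule described above lends itself to a mechanical check. A minimal sketch, assuming melodies given as MIDI note numbers, treating any interval larger than two semitones as a leap, and ignoring the permissible-sonority condition on double leaps:

# Sketch of the cantus-firmus leap rule: a leap must be followed either by
# a step in the opposite direction or by another leap. Pitches are MIDI
# note numbers; intervals larger than 2 semitones count as leaps.
def leap_rule_violations(melody):
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:                                  # we just leapt...
            steps_back = 0 < abs(second) <= 2 and first * second < 0
            leaps_again = abs(second) > 2
            if not (steps_back or leaps_again):
                violations.append(i + 2)                    # offending note index
    return violations

# D4 -> G4 is a leap followed by a step *up*, so note index 5 is flagged.
print(leap_rule_violations([60, 65, 64, 62, 67, 69]))       # [5]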


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that

-218-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-219-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-220-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar

-221-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-222-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-223-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

-224-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
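
A small sketch of the distinction, assuming a histogram plug-in estimator for MI (the bin
count is an arbitrary choice of this sketch): for y = x**2 the dependence is strong but
non-monotonic, so Pearson correlation is near zero while the MI estimate is clearly positive.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2  # strong dependence, but neither linear nor monotonic

# Pearson correlation is near zero: there is no linear trend to pick up.
r = np.corrcoef(x, y)[0, 1]

def mi_histogram(x, y, bins=30):
    # Plug-in MI (in nats) from a 2-D histogram of the joint sample.
    pxy = np.histogram2d(x, y, bins=bins)[0]
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(f"Pearson r = {r:.3f}")                           # ~ 0.0
print(f"MI estimate = {mi_histogram(x, y):.3f} nats")   # clearly > 0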

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP. The former is a linear measure and the latter two are non-linear
ones.
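
A rough sketch of the three SSMs named above. The zero-lag correlation, the 16-bin histogram
estimator, and the order-3 Bandt-Pompe ordinal patterns are assumptions of this sketch; the
paper's exact estimators may differ.

import numpy as np
from collections import Counter
from itertools import permutations

def ordinal_patterns(x, d=3):
    # Encode a series as order-d ordinal patterns: each length-d window
    # is mapped to the index of its rank-order permutation.
    index = {p: i for i, p in enumerate(permutations(range(d)))}
    return [index[tuple(np.argsort(x[i:i + d]))] for i in range(len(x) - d + 1)]

def mi_discrete(a, b):
    # Plug-in mutual information (nats) of two equal-length discrete sequences.
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum(c / n * np.log((c / n) / ((pa[i] / n) * (pb[j] / n)))
               for (i, j), c in pab.items())

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 2000)
x = np.sin(t) + 0.3 * rng.normal(size=t.size)
y = np.sin(t + 1.0) + 0.3 * rng.normal(size=t.size)

cc = abs(np.corrcoef(x, y)[0, 1])                    # |cross correlation| at zero lag
bins = np.histogram_bin_edges(np.r_[x, y], bins=16)  # shared bins for the plain MI
mi = mi_discrete(list(np.digitize(x, bins)), list(np.digitize(y, bins)))
miop = mi_discrete(ordinal_patterns(x), ordinal_patterns(y))
print(f"CC = {cc:.3f}   MI = {mi:.3f}   MIOP = {miop:.3f}")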

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
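
The contrast can be seen in a short sketch: the sample covariance follows directly from
moments, while a plug-in MI estimate drifts with the arbitrary bin count, illustrating why
density estimation makes MI the more delicate quantity.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 0.5 * x + rng.normal(scale=0.5, size=500)

# Covariance needs only sample moments: E[XY] - E[X]E[Y].
cov = (x * y).mean() - x.mean() * y.mean()
print(f"sample covariance = {cov:.3f}")

# An MI estimate needs the (unknown) joint density; with a histogram
# plug-in estimator the answer shifts with the arbitrary bin count.
for bins in (5, 20, 80):
    pxy = np.histogram2d(x, y, bins=bins)[0]
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
    print(f"MI plug-in estimate with {bins:2d} bins = {mi:.3f} nats")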

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others
have however called such work into question.[2] Other theorists have obviated voice-leading as
a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
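
As a toy illustration only (the class, field names, and realization rule below are invented
for this sketch, not the theory's own formalism), a lyne can be modeled as an uninterpreted pc
segment bound to the articulation dimensions that keep its components associated:

from dataclasses import dataclass
from typing import List

@dataclass
class Lyne:
    # An uninterpreted pc segment plus articulation dimensions that
    # keep its components associated in the musical texture.
    pcs: List[int]      # pitch classes, 0-11
    register: range     # the pitches this lyne may occupy (one octave here)
    instrument: str     # a second articulative dimension binding the lyne

    def realize(self) -> List[int]:
        # Realize each pc as the unique pitch with that pitch class
        # inside this lyne's one-octave register.
        lo = self.register.start
        return [lo + ((pc - lo) % 12) for pc in self.pcs]

upper = Lyne(pcs=[0, 4, 7, 11], register=range(72, 84), instrument="flute")
lower = Lyne(pcs=[1, 6, 8], register=range(48, 60), instrument="cello")
print(upper.realize())  # [72, 76, 79, 83]
print(lower.realize())  # [49, 54, 56]

Keeping each lyne in its own register (and timbre) is one "means of musical articulation" that
lets a listener hear the two pc segments as separate strands.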

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music, as described above.

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

Set Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
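
A minimal sketch of the definitions above, using Python's built-in set type (the variable
names are illustrative):

U = set(range(12))                     # (6): the aggregate, all twelve pcs

def pcset(pitches):
    # (4): pitches map to pitch-classes by taking them mod 12.
    return {p % 12 for p in pitches}

A = pcset([60, 64, 67])                # C-E-G realized around middle C
B = pcset([52, 55, 59])                # E-G-B realized an octave lower

print(A | B)            # (9) union: {0, 4, 7, 11}
print(A & B)            # (10) intersection: {4, 7}
print(A.isdisjoint(B))  # (11) disjoint? False: they share {4, 7}
print(U - A)            # (12) complement A' relative to the aggregate U
print(A & (U - A))      # A intersect A' is the null set: set()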

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work

-239-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
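
As a concrete illustration of quantifying pitch by comparison with pure tones, the sketch below
maps a measured frequency to the nearest pitch, assuming standard 12-tone equal temperament with
A4 = 440 Hz; the function name and layout are mine, not from the article quoted above.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_nearest_pitch(freq_hz):
    # MIDI note numbering: 69 = A4; one semitone = a frequency factor of 2**(1/12).
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    octave = midi // 12 - 1           # MIDI 60 -> C4 by convention
    return f"{NOTE_NAMES[midi % 12]}{octave}"

print(freq_to_nearest_pitch(261.63))  # -> C4
print(freq_to_nearest_pitch(466.16))  # -> A#4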
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed.
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll


cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310
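
(Annotation, from my reading of the WinDbg documentation, for the commands above: in
!process 0 1 <image>, the first argument 0 means all processes, filtered here by image name,
and the second is a detail-flags bitmask, so 1 prints basic timing and priority statistics
while 7 also lists each thread and its wait state; !irp <address> dumps the I/O request packet
at that address, typically one found in the preceding !process output, showing its stack
locations and which driver is processing it.)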

Example 18. Primes enclosed in rectangles
<0 2 1>
<1 0 3 2>
<131 402>

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of undirected intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher-order values. We input these into the
algorithm to produce a second-order segmentation, and so forth, until the music is parsed into
a single segment. To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg
piece. Example 21a shows the results using one dimension, pitch alone. The first pass segments
the pitches into segments of three to six pitches; that is, the segmentation is determined by
the sequence of the sizes of successive unordered pitch intervals. The segmental boundaries
are shown by vertical lines. The results are quite reasonable. For instance, the four pitches
<E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a
lower register from the others. Phrases 4 and 5 seem segmented correctly; the first is
divided into two segments, the second into one. And in general, the boundaries of these
first-level segments never contradict our more intuitively derived phrase structure. The
second pass works on the averages of the values in each level-1 segment. These averages are
simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The
intervals between the series of bandwidths forming level 2 are the input to the second pass of
the algorithm. The resulting second-level segmentation divides the piece in half in the
middle of the third phrase, which contradicts our six-phrase structure. That the second-pass
parsing is at variance with our phrase structure is not an embarrassment, for we are taking
pitch intervals as the only criterion for segmentation. Let us examine Example 21b with the
algorithm's taking only time spans between notes as input. Here the unit of time is a
thirty-second note. Once again the first level basically conforms to our ideas of the phrase
structure, with two exceptions. Likewise, the second pass partitions the stream of durations
so that it has an exception inherited from level 1; the last phrase is divided in half,
with its first part serving as a conclusion to the second-level segment that starts at phrase
4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial
values of the previous examples are simply added together. This time the results get
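
Since the procedure is described above only in prose, the following Python sketch restates it
under explicit assumptions: a single dimension at a time, undirected intervals scanned in threes
for a strict local maximum, and segment averages fed back in for higher-order passes. This is my
paraphrase of the description, not Tenney and Polansky's published code; combining several
dimensions by a user-specified weighted average is indicated only in the closing comment.

def one_pass(values):
    # Segment a list of numeric values at local maxima of its interval string.
    # intervals[i] = |values[i+1] - values[i]|; scanning intervals in threes
    # (X, Y, Z), a middle interval Y greater than both neighbors opens a new
    # segment at the value that follows Y (the "C" of Example 20).
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    boundaries = [0]
    for i in range(1, len(intervals) - 1):
        x, y, z = intervals[i - 1], intervals[i], intervals[i + 1]
        if y > x and y > z:
            boundaries.append(i + 1)   # value C = values[i + 1] starts a segment
    boundaries.append(len(values))
    return [values[s:e] for s, e in zip(boundaries, boundaries[1:])]

def parse(values):
    # Re-apply one_pass to the segment averages (the higher-order values)
    # until the music is parsed into a single segment.
    levels = []
    while len(values) > 1:
        segments = one_pass(values)
        levels.append(segments)
        if len(segments) == 1:
            break
        values = [sum(seg) / len(seg) for seg in segments]
    return levels

# Pitches as MIDI note numbers; a multi-dimensional run would first combine
# interval strings, e.g. [wp*p + wd*d for p, d in zip(pitch_ivs, dur_ivs)].
print(parse([64, 66, 67, 65, 72, 74, 71, 60, 62]))

On this toy input the first pass yields the three segments [64, 66, 67, 65], [72, 74, 71],
and [60, 62], after which the three segment averages parse into a single segment.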

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.

(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12
(see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of
octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a
contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no
pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
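
These definitions map directly onto Python's built-in set type; the sketch below merely
illustrates points (2) through (12) of the primer, with mod-12 reduction standing in for the
pitch-to-pc mapping. The names are mine.

AGGREGATE = set(range(12))            # U, the set of all twelve pcs

def pcs_from_pitches(pitches):
    # Map pitches (integers, middle C = 60) to pitch classes mod 12;
    # octave-related pitches collapse onto the same pc.
    return {p % 12 for p in pitches}

pset = [60, 64, 67, 76]               # a pset: C4, E4, G4, E5
A = pcs_from_pitches(pset)            # the pcset {0, 4, 7}
B = {0, 2, 7}

print(A | B)                          # union, A ∪ B
print(A & B)                          # intersection, A ∩ B
print(A.isdisjoint(B))                # True iff A ∩ B = ∅
complement_A = AGGREGATE - A          # A′, all elements of U not in A
print((A & complement_A) == set())    # A ∩ A′ = ∅, always True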

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
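
A small numeric sketch of the point just made: for y = x^2 on a symmetric interval, the Pearson
correlation is near zero while the mutual information is clearly positive. The histogram
plug-in estimator below is a crude choice made for self-containment, not a recommendation.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2                       # fully determined by x, but not monotonically

def mutual_information(a, b, bins=30):
    # Plug-in MI estimate in nats from a 2-D histogram of (a, b).
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of b
    nz = pxy > 0                              # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(np.corrcoef(x, y)[0, 1])   # ~0: no linear (Pearson) relationship
print(mutual_information(x, y))  # > 0: strong statistical dependence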

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-252-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

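As a deliberately simplified reading of the leap rule quoted above (a sketch; the semitone
thresholds are my assumptions, and the "two successive leaps outlining a permissible sonority"
case is omitted):

# Flag leaps that are not recovered by a step in the opposite direction.
# Pitches are MIDI note numbers; "step" = 1-2 semitones, "leap" = 3 or more.
def leap_violations(melody):
    """Return indices i where melody[i] -> melody[i+1] is a leap that is
    not followed by a step in the opposite direction."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) >= 3:                       # a leap...
            is_step = 1 <= abs(second) <= 2
            opposite = first * second < 0
            if not (is_step and opposite):        # ...not recovered by step
                bad.append(i)
    return bad

# D-mode cantus firmus fragment: D4 F4 E4 D4 G4 F4 A4 G4 F4 E4 D4
cf = [62, 65, 64, 62, 67, 65, 69, 67, 65, 64, 62]
print(leap_violations(cf))   # [] -- every leap here resolves by step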
From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...

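Since the jargon "6-Z44" and "interval vector" comes up here, a small sketch makes the second
term concrete: the interval-class vector counts, for each interval class 1-6, the unordered
pairs of pitch classes in a set. The set below is the prime form usually given for Forte's
6-Z44; treat the label as illustrative.

# Interval-class vector: for each unordered pair of pitch classes, take the
# interval mod 12, fold it into an interval class 1..6, and count.
from itertools import combinations

def interval_class_vector(pcs):
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = (b - a) % 12
        ic = min(ic, 12 - ic)          # fold intervals 7..11 onto 5..1
        vec[ic - 1] += 1
    return vec

# Prime form commonly given for the hexachord 6-Z44:
print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]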
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

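The "quantified as frequencies ... by comparing sounds with pure tones" idea has a standard
concrete form in equal temperament; a sketch, assuming the A4 = 440 Hz reference (the excerpt
above does not fix one):

# Quantifying pitch: snap a measured frequency (Hz) to the nearest note of
# 12-tone equal temperament, using A4 = 440 Hz (MIDI 69) as the reference.
import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz, a4=440.0):
    midi = round(69 + 12 * math.log2(freq_hz / a4))
    name = NAMES[midi % 12] + str(midi // 12 - 1)     # MIDI 60 -> C4
    exact = a4 * 2 ** ((midi - 69) / 12)
    cents = 1200 * math.log2(freq_hz / exact)          # deviation from the note
    return name, round(cents, 1)

print(nearest_pitch(440.0))    # ('A4', 0.0)
print(nearest_pitch(261.63))   # ('C4', 0.0)
print(nearest_pitch(450.0))    # ('A4', 38.9) -- about 39 cents sharp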
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-268-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-269-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

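The leap rule just described is mechanical enough to check by machine. A toy sketch, assuming
pitches given as MIDI note numbers and treating anything larger than two semitones as a leap;
real species counterpoint is diatonic and stricter (and the three-note-sonority proviso is not
modeled), so this only approximates the rule as stated:

def leap_rule_ok(melody, leap=2):
    """True if every leap (> `leap` semitones) is followed either by a
    step in the opposite direction or by another leap."""
    steps = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(steps, steps[1:]):
        if abs(prev) > leap:                               # prev is a leap
            step_back = abs(nxt) <= leap and prev * nxt < 0
            if not (step_back or abs(nxt) > leap):
                return False
    return True

print(leap_rule_ok([60, 64, 62, 61, 62]))  # leap up, then steps down: True
print(leap_rule_ok([60, 67, 69]))          # leap up, then step up: False
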
Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
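
Of that "forbidding" vocabulary, the interval vector at least is easy to compute. A sketch: the
interval-class vector of a pitch-class set counts, for each interval class 1 through 6, the
unordered pairs of distinct pitch classes at that distance. The set {0,1,2,5,6,9} used below is
the prime form usually given for Forte's 6-Z44 (cited from memory; verify against a set-class
table):

from itertools import combinations

def interval_class_vector(pcs):
    # Tally interval classes 1..6 over all unordered pairs of distinct pcs
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        vec[min(d, 12 - d) - 1] += 1   # fold intervals above 6 onto complements
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]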

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
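
That comparison against pure tones is also what lets software attach note names to measured
frequencies. A sketch assuming twelve-tone equal temperament with A4 = 440 Hz (the usual
convention, though nothing in the text above fixes it):

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(f_hz, a4=440.0):
    # Nearest equal-tempered note plus the deviation from it in cents
    midi = 69 + 12 * math.log2(f_hz / a4)
    nearest = round(midi)
    cents = 100 * (midi - nearest)
    return NAMES[nearest % 12] + str(nearest // 12 - 1), cents

print(freq_to_pitch(440.0))    # ('A4', 0.0)
print(freq_to_pitch(261.63))   # middle C, within a fraction of a cent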

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-282-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. (A toy formalization of
one such rule follows this passage.)

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
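
Such rules are mechanical enough to state as code. The toy check below flags leaps that are not
followed by a step in the opposite direction or by another leap; representing the melody as
MIDI note numbers and treating anything larger than two semitones as a leap are simplifying
assumptions, and the permissible-sonority condition on double leaps is omitted.

def leap_rule_violations(melody, step=2):
    # Return indices of notes reached by a leap that is not resolved by a
    # step in the opposite direction or continued by another leap.
    bad = []
    for i in range(len(melody) - 2):
        a = melody[i + 1] - melody[i]      # interval into the note
        b = melody[i + 2] - melody[i + 1]  # interval out of the note
        is_leap = abs(a) > step
        if is_leap and not (abs(b) > step or (abs(b) <= step and a * b < 0)):
            bad.append(i + 1)
    return bad

# D4 -> A4 is a leap answered by a step back down (fine); G4 -> C5 is a leap
# continued by a step in the SAME direction, so the C5 (index 3) is flagged.
print(leap_rule_violations([62, 69, 67, 72, 74, 72]))  # -> [3]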

From its earliest days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
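
The interval vector, at least, is easy to demystify: it simply counts how many pairs of pitch
classes in a set lie each interval class (1 through 6) apart. A minimal sketch, using the
standard prime form [0, 1, 2, 5, 6, 9] for set class 6-Z44:

from itertools import combinations

def interval_vector(pcs):
    # Interval-class vector of a pitch-class set: for every unordered pair,
    # reduce the pc interval mod 12 to its interval class (1..6) and tally.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12
        vec[min(d, 12 - d) - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]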

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
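
That quantification is a simple logarithmic map; for instance, in twelve-tone equal temperament
with A4 = 440 Hz, a frequency f lands at MIDI note 69 + 12*log2(f/440). The reference tuning
and note naming in the sketch below are conventional assumptions, not part of the quoted text.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz, a4=440.0):
    # Snap a frequency to the nearest equal-tempered pitch and report the
    # remaining deviation in cents (hundredths of a semitone).
    midi = round(69 + 12 * math.log2(freq_hz / a4))
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
    cents = 1200 * math.log2(freq_hz / (a4 * 2 ** ((midi - 69) / 12)))
    return name, round(cents, 1)

print(nearest_pitch(261.0))  # ('C4', -4.1): just under middle C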
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-284-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-285-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-286-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-287-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-288-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-289-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

-290-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-291-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-292-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are

-293-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-294-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
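
The leap rule quoted above is concrete enough to check mechanically. A toy checker (my own
illustration, with a simplified notion of "step" and the permissible-sonority condition on two
successive leaps omitted):

def leap_rule_ok(melody):
    """melody: list of pitches as integers (semitones)."""
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) > 2:                        # a leap: more than a whole step
            step_back    = 0 < abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2        # sonority check omitted here
            if not (step_back or another_leap):
                return False
    return True

assert leap_rule_ok([60, 65, 64, 62])             # leap up, then steps back down
assert not leap_rule_ok([60, 65, 65, 62])         # leap followed by a repeated note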

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-297-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
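
As a concrete example of quantifying pitch as frequency, here is the standard 12-tone
equal-temperament mapping with the A4 = 440 Hz convention (an illustrative assumption on my
part; the excerpt above does not fix any particular tuning):

import math

def midi_to_hz(n: int) -> float:
    """Frequency of MIDI note n in 12-TET, with A4 (note 69) = 440 Hz."""
    return 440.0 * 2.0 ** ((n - 69) / 12)

def hz_to_nearest_midi(f: float) -> int:
    """Nearest equal-tempered note for a measured frequency."""
    return round(69 + 12 * math.log2(f / 440.0))

assert midi_to_hz(69) == 440.0
assert hz_to_nearest_midi(261.63) == 60   # ~ middle C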

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
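
Since the primer's definitions map one-to-one onto ordinary finite-set operations, they can be
sketched directly in code (my illustration; the particular note numbers are assumptions):

AGGREGATE = frozenset(range(12))          # U, the set of all twelve pcs

def pcset(pitches):
    """Realize a set of pitches (integers, any octave) as a pcset, mod 12."""
    return frozenset(p % 12 for p in pitches)

A = pcset([60, 64, 67])                   # a C-major-triad pset -> pcset {0, 4, 7}
B = pcset([67, 71, 74])                   # a G-major-triad pset -> pcset {2, 7, 11}

union        = A | B                      # A ∪ B
intersection = A & B                      # A ∩ B == {7}
complement   = AGGREGATE - A              # A′
assert A & complement == frozenset()      # A ∩ A′ = ∅: A and A′ are disjoint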

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
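
As a toy version of the leap rule just described (my own simplification, not a real counterpoint
checker: melodies are given as diatonic scale degrees, a step is one degree, and "two leaps
outlining a permissible sonority" is approximated as two same-direction leaps spanning a fifth):

# Flag leaps that are not followed by a step back or by a triad-like leap.
def check_leaps(degrees):
    ivs = [b - a for a, b in zip(degrees, degrees[1:])]
    problems = []
    for i in range(len(ivs) - 1):
        cur, nxt = ivs[i], ivs[i + 1]
        if abs(cur) > 1:                                      # a leap (third or more)
            step_back = abs(nxt) == 1 and cur * nxt < 0
            triad_like = abs(nxt) > 1 and abs(cur + nxt) == 4  # outlines a fifth
            if not (step_back or triad_like):
                problems.append(i + 1)                        # offending note index
    return problems

print(check_leaps([0, 2, 1]))   # third up, step down          -> []
print(check_leaps([0, 2, 4]))   # two thirds outlining a fifth -> []
print(check_leaps([0, 3, 4]))   # fourth up then step up       -> [1]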

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
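
For reference, the "interval vector" the passage invokes is easy to state in code (standard
pc-set definition; the example set {0, 1, 4, 6} is the all-interval tetrachord):

# Interval-class vector: counts of the six interval classes (1..6)
# over all unordered pairs of pitch classes in the set.
from itertools import combinations

def interval_vector(pcs):
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 4, 6]))   # -> [1, 1, 1, 1, 1, 1]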

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
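
A minimal sketch of the frequency-to-pitch quantification described above, assuming twelve-tone
equal temperament and the A4 = 440 Hz reference:

# Map a frequency in Hz to the nearest equal-tempered note name.
import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))   # nearest MIDI number
    return f"{NAMES[midi % 12]}{midi // 12 - 1}"

print(nearest_note(261.63))   # -> C4
print(nearest_note(446.0))    # -> A4 (within a quarter tone of 440 Hz)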

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-312-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-313-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch

-314-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-315-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-316-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between

-317-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-318-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-319-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-320-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-321-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-322-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-323-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
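
As a gloss on the vocabulary: the "interval vector" mentioned above is straightforward to
compute. A minimal sketch, with pitch classes as integers mod 12; the set below is the prime
form of Forte's 6-Z44:

from itertools import combinations

def interval_vector(pcset):
    # Interval-class vector: counts of interval classes 1..6 over all
    # unordered pairs of distinct pitch classes.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = min((a - b) % 12, (b - a) % 12)
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]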

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,

-324-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
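
The quantifying step described above (matching a sound against pure tones and thus assigning
it a frequency and a pitch) can be sketched as a frequency-to-nearest-note conversion, assuming
twelve-tone equal temperament and the common A4 = 440 Hz reference:

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz, a4=440.0):
    # Round to the nearest equal-tempered MIDI note (A4 = MIDI 69), then
    # convert to a note name with its octave number.
    midi = round(69 + 12 * math.log2(freq_hz / a4))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(nearest_pitch(261.6))  # C4 (middle C)
print(nearest_pitch(446.0))  # A4: close enough to 440 Hz to round there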


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
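That assignment is easy to make concrete: a measured frequency can be quantized to the nearest
pitch of a tuning system. A minimal Python sketch, assuming twelve-tone equal temperament with
A4 = 440 Hz (a convention, not something the text above fixes):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(freq_hz: float):
    """Return (note name, octave, cents offset) for a measured frequency."""
    midi = 69 + 12 * math.log2(freq_hz / 440.0)   # fractional MIDI number
    nearest = round(midi)
    cents = 100 * (midi - nearest)                # deviation from the pitch
    return NOTE_NAMES[nearest % 12], nearest // 12 - 1, cents

print(freq_to_pitch(261.63))   # ~middle C: ('C', 4, ~0 cents)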
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
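A quick numerical illustration of that distinction, sketched in Python with numpy, scipy, and
scikit-learn; estimating MI by discretizing each variable into 32 histogram bins is an
arbitrary choice of this sketch, not part of the quoted answer:

import numpy as np
from scipy.stats import pearsonr, spearmanr
from sklearn.metrics import mutual_info_score

def binned(v, bins=32):
    """Discretize a sample so MI can be estimated from a histogram."""
    return np.digitize(v, np.histogram_bin_edges(v, bins))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)

# y1 depends on x monotonically (Spearman sees it perfectly);
# y2 depends on x non-monotonically (both correlations fail, MI does not).
for name, y in [("y = exp(x)  (monotonic)", np.exp(x)),
                ("y = x**2    (non-monotonic)", x ** 2)]:
    print(name)
    print("  Pearson  r =", round(pearsonr(x, y)[0], 3))
    print("  Spearman r =", round(spearmanr(x, y).correlation, 3))
    print("  MI (nats)  =", round(mutual_info_score(binned(x), binned(y)), 3))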

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series' ordinal patterns (MIOP) [25]. The first is a linear measure and the latter two are
non-linear ones.
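The three measures are straightforward to prototype. Below is a rough Python sketch under
simplifying assumptions the excerpt does not specify: MI estimated from 16-bin histograms,
ordinal patterns of embedding dimension 3, and zero-lag correlation only.

import numpy as np
from itertools import permutations
from sklearn.metrics import mutual_info_score

def cc(x, y):
    """Absolute value of the zero-lag cross correlation (|Pearson's r|)."""
    return abs(np.corrcoef(x, y)[0, 1])

def mi(x, y, bins=16):
    """Histogram-based mutual information estimate (nats)."""
    return mutual_info_score(np.digitize(x, np.histogram_bin_edges(x, bins)),
                             np.digitize(y, np.histogram_bin_edges(y, bins)))

def ordinal_symbols(x, dim=3):
    """Map each window of `dim` samples to the index of its rank pattern."""
    lookup = {p: i for i, p in enumerate(permutations(range(dim)))}
    wins = np.lib.stride_tricks.sliding_window_view(x, dim)
    return np.array([lookup[tuple(np.argsort(w))] for w in wins])

def miop(x, y, dim=3):
    """Mutual information of the two series' ordinal-pattern sequences."""
    return mutual_info_score(ordinal_symbols(x, dim), ordinal_symbols(y, dim))

t = np.linspace(0, 20 * np.pi, 4000)
a = np.sin(t)
b = np.sin(t + 0.3) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(cc(a, b), mi(a, b), miop(a, b))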

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task compared to the estimation of Covariance.
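That asymmetry is easy to see numerically. In the Python sketch below, the sample covariance is
plain moment arithmetic, while the MI estimate moves with the (arbitrary) histogram bin count
used to stand in for the unknown densities:

import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(42)
x = rng.normal(size=5000)
y = x ** 2   # a function of x, yet uncorrelated with it

# Covariance straight from sample moments: no distributions needed.
cov = (x * y).mean() - x.mean() * y.mean()
print(f"sample covariance: {cov:+.3f}")   # ~0 despite full dependence

# MI needs the densities; the histogram estimate shifts with the bin count.
for bins in (4, 16, 64):
    lab = lambda v: np.digitize(v, np.histogram_bin_edges(v, bins))
    print(bins, "bins -> MI =", round(mutual_info_score(lab(x), lab(y)), 3))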

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-339-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-340-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds

-341-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-342-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-343-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl

-344-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-345-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-346-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-347-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-348-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-349-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-350-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a

-351-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
"In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014.
http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
Quarterly 77/4 (1993), unnumbered note on 644-645.
David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006), 8.
David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of
American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press,
2013), 8:175-176.
ry Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
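
A minimal sketch of definitions (1)-(12) in Python, modeling pcsets as frozensets of integers
mod 12; the example pitches are my own:

U = frozenset(range(12))    # the aggregate: all twelve pcs

def pcset(*pitches):
    # Take pitches mod 12: octave-related pitches map to the same pc,
    # and the result is unordered.
    return frozenset(p % 12 for p in pitches)

A = pcset(60, 64, 67)    # a pset realizing the pcset {0, 4, 7}
B = pcset(55, 59, 62)    # {2, 7, 11}
print(A | B)             # union
print(A & B)             # intersection: {7}, so A and B are not disjoint
print(U - A)             # complement A' relative to the aggregate
assert A & (U - A) == frozenset()   # A and A' are disjoint (the null set)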

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary or for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

Example 21a shows the results using one dimension (pitch alone). The first pass segments the
pitches into segments of three to six pitches; that is, the segmentation is determined by the
sequence of the sizes of successive unordered pitch intervals.
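
A minimal single-dimension sketch of the procedure as described: scan the undirected intervals
in threes, open a new segment wherever the middle interval is the local maximum, then average
each segment and re-run on the averages until one segment remains. The function names and the
toy input are my own, and the user-specified weighted average over several dimensions is
omitted:

def segment(values):
    # One pass: for successive values A-D forming intervals X, Y, Z, a middle
    # interval greater than its neighbors starts a new segment at value C.
    ivs = [abs(b - a) for a, b in zip(values, values[1:])]
    bounds = [0]
    for i in range(1, len(ivs) - 1):
        if ivs[i] > ivs[i - 1] and ivs[i] > ivs[i + 1]:
            bounds.append(i + 1)
    bounds.append(len(values))
    return [values[a:b] for a, b in zip(bounds, bounds[1:])]

def parse(values):
    # Average each output segment, feed the averages back in, and repeat
    # until the music is parsed into a single segment.
    levels = [segment(values)]
    while len(levels[-1]) > 1:
        means = [sum(s) / len(s) for s in levels[-1]]
        levels.append(segment(means))
    return levels

print(parse([60, 62, 61, 70, 69, 71, 60, 59, 58, 67]))   # toy pitch string
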
read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(reboot required for the change to take effect)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue: symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversi
on-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols

D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles
<0 2 1> <1 0 3 2> < 131 402>
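
The bracketed tokens above are contour-segments (csegs): each note is replaced by its rank
within the segment, 0 for the lowest. A minimal sketch of the derivation; the example pitches
are my own, and ties between equal values are not handled:

def cseg(values):
    # Map each value to its rank so that any pitch realization of the same
    # contour yields the same cseg.
    ranks = sorted(set(values))
    return tuple(ranks.index(v) for v in values)

print(cseg([64, 71, 66]))        # (0, 2, 1)
print(cseg([50, 43, 62, 55]))    # (1, 0, 3, 2)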

read the 00 Kernel Debug Guide


https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversi
on-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols

-364-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's suc- cessive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/ Polansky algorithm: For four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of un- directed intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher- order values. We input these into
the algorithm to produce a second-order segmentation and so forth, until the music is parsed
into a single segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the
Schoenberg piece. Example 21a shows the results using one dimension -pitch alone. The first
pass segments the pitches into segments of three to six pitches; that is, the seg- mentation is
determined by the sequence of the sizes of suc- cessive unordered pitch intervals. The
segmental boundaries In essence, the algorithm looks at a string of intervals derived from the
successive values in some musical dimension in a piece of music. The string might be a series
of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one
string can be selected for analysis. Then the algorithm combines the values of each dimension's
suc- cessive intervals according to a user-specified average which assigns a relative "weight"
to each of the dimensions. Example 20 illustrates the principle of the Tenney/ Polansky
algorithm: For four successive dimension values labeled A through D forming three successive
unordered intervals labeled X, Y, and Z, if the middle interval is greater than the other two
intervals, the string of values is segmented in half; the value C starts a new segment or
phrase. In its simplest form, using only one musical dimension, the algorithm works by going
through the dimension's list of un- directed intervals in threes looking for maximum values and
segmenting accordingly. This results in a series of successive segments (or phrases). We can
then average the values in each of the output segments to get a series of new higher- order
values. We input these into the algorithm to produce a second-order segmentation and so forth,
until the music is parsed into a single segment. To illustrate the Tenney/Polansky Algorithm,
we perform it on the Schoenberg piece. Example 21a shows the results using one dimension -pitch
alone. The first pass segments the pitches into segments of three to six pitches; that is, the
seg- mentation is determined by the sequence of the sizes of suc- cessive unordered pitch

-365-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
intervals. The segmental boundaries In essence, the algorithm looks at a string of intervals
derived from the successive values in some musical dimension in a piece of music. The string
might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth.
More than one string can be selected for analysis. Then the algorithm combines the values of
each dimension's suc- cessive intervals according to a user-specified average which assigns a
relative "weight" to each of the dimensions. Example 20 illustrates the principle of the
Tenney/ P

olansky algorithm: For four successive dimension values labeled A through D forming three
successive unordered intervals labeled X, Y, and Z, if the middle interval is greater than the
other two intervals, the string of values is segmented in half; the value C starts a new
segment or phrase. In its simplest form, using only one musical dimension, the algorithm works
by going through the dimension's list of un- directed intervals in threes looking for maximum
values and segmenting accordingly. This results in a series of successive segments (or
phrases). We can then average the values in each of the output segments to get a series of new
higher- order values. We input these into the algorithm to produce a second-order segmentation
and so forth, until the music is parsed into a single segment. To illustrate the
Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a shows the results
using one dimension -pitch alone. The first pass segments the pitches into segments of three to
six pitches; that is, the seg- mentation is determined by the sequence of the sizes of suc-
cessive unordered pitch intervals. The segmental boundaries In essence, the algorithm looks at
a string of intervals derived from the successive values in some musical dimension in a piece
of music. The string might be a series of pitch intervals, time intervals (delays), dynamic
changes, and so forth. More than one string can be selected for analysis. Then the algorithm
combines the values of each dimension's suc- cessive intervals according to a user-specified
average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the
principle of the Tenney/ Polansky algorithm: For four successive dimension values labeled A
through D forming three successive unordered intervals labeled X, Y, and Z, if the middle
interval is greater than the other two intervals, the string of values is segmented in half;
the value C starts a new segment or phrase. In its simplest form, using only one musical
dimension, the algorithm works by going through the dimension's list of un- directed intervals
in threes looking for maximum values and segmenting accordingly. This results in a series of
successive segments (or phrases). We can then average the values in each of the output segments
to get a series of new higher- order values. We input these into the algorithm to produce a
second-order segmentation and so forth, until the music is parsed into a single segment. To
illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension -pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the seg- mentation is determined by the sequence of
the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In essence, the
are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a lower register than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation. Let us examine Example 21b, where the algorithm takes only the time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
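
The first-pass rule and the between-pass averaging lend themselves to a compact sketch. The following Python code is a minimal illustration of the procedure as described above, not Tenney and Polansky's own implementation; the function names are mine, the input is assumed to be a single dimension encoded as numbers (e.g., MIDI pitch numbers), and only the one-dimensional case is handled.

def first_pass_segments(values):
    """Segment a list of one-dimensional values by the rule above:
    for three successive unordered intervals X, Y, Z (spanning four
    values A through D), if the middle interval Y exceeds both X and
    Z, a new segment starts at the value C."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    boundaries = []
    for i in range(1, len(intervals) - 1):
        x, y, z = intervals[i - 1], intervals[i], intervals[i + 1]
        if y > x and y > z:
            boundaries.append(i + 1)  # index of C in `values`
    segments, start = [], 0
    for b in boundaries:
        segments.append(values[start:b])
        start = b
    segments.append(values[start:])
    return segments

def next_level(segments):
    """One choice of between-pass average for pitch, following the
    text: the center of each segment's bandwidth in semitones. The
    result is fed back into first_pass_segments for the next-order
    segmentation."""
    return [(max(s) + min(s)) / 2 for s in segments]

pitches = [64, 66, 67, 65, 72, 71, 70]      # hypothetical fragment
level1 = first_pass_segments(pitches)        # [[64, 66, 67, 65], [72, 71, 70]]
level2 = next_level(level1)                  # [65.5, 71.0]

Repeating the two steps until one segment remains yields the hierarchy of segmentations described above.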

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
Set Theory Primer

1. Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
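
Since these definitions map directly onto ordinary finite-set operations, a short Python sketch may make them concrete; the names AGGREGATE and pc, and the sample sets, are mine and chosen only for illustration.

AGGREGATE = set(range(12))             # U, the set of all twelve pcs

def pc(pitch):
    """Pitch (integer semitones) to pitch class, mod 12; pitches any
    number of octaves apart map to the same pc."""
    return pitch % 12

pset = {60, 64, 67, 76}                # a pset (here C4, E4, G4, E5)
pcset = {pc(p) for p in pset}          # {0, 4, 7}: E4 and E5 merge

A, B = {0, 4, 7}, {4, 7, 11}
print(A | B)                           # union, A ∪ B
print(A & B)                           # intersection, A ∩ B
print(AGGREGATE - A)                   # complement, A′ = U minus A
print((A & (AGGREGATE - A)) == set())  # A ∩ A′ is empty: True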

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
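
As a toy illustration of the leap rule paraphrased above, the following sketch checks one clause only: that a leap must be followed by a step in the opposite direction or by another leap. The permissible-sonority condition on successive leaps is omitted, and the threshold treating anything larger than two semitones as a leap is my assumption, not a rule from the text.

def leap_rule_ok(melody):
    """melody: pitches as integers. True if every leap (> 2 semitones
    here) is followed by a step in the opposite direction or by
    another leap."""
    moves = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(moves, moves[1:]):
        if abs(prev) > 2:                          # a leap...
            step_back = abs(nxt) <= 2 and prev * nxt < 0
            another_leap = abs(nxt) > 2
            if not (step_back or another_leap):
                return False
    return True

print(leap_rule_ok([60, 65, 64, 62]))  # leap up, then step down: True
print(leap_rule_ok([60, 65, 67, 69]))  # leap up, then step up: False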


arlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional o

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
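
The quantification mentioned above, assigning a pitch to a frequency by comparison with equal-tempered reference tones, can be sketched in a few lines of Python. The A4 = 440 Hz = MIDI 69 reference is a common convention and my assumption here, not something the quoted text fixes.

import math

def freq_to_midi(f_hz):
    """Nearest 12-tone-equal-temperament pitch number for a frequency
    in hertz, relative to A4 = 440 Hz = MIDI 69."""
    return round(69 + 12 * math.log2(f_hz / 440.0))

def midi_to_freq(n):
    """Frequency in hertz of an equal-tempered MIDI pitch."""
    return 440.0 * 2 ** ((n - 69) / 12)

print(freq_to_midi(261.63))  # middle C: 60
print(midi_to_freq(69))      # 440.0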

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
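
A small numeric check of the contrast drawn in these quotations: for Y = X² on a symmetric interval, the linear correlation is near zero although Y is a function of X, while even a crude histogram ("plug-in") mutual-information estimate comes out clearly positive. The estimator below is assumed purely for illustration and is not the method used in either source; it requires numpy.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100_000)
y = x ** 2                                # dependent on x, but not linearly

def mutual_information(a, b, bins=30):
    """Crude plug-in MI estimate (in nats) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(np.corrcoef(x, y)[0, 1])            # close to 0: no linear association
print(mutual_information(x, y))           # clearly positive: dependence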

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-381-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-382-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
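
The leap rule quoted above is concrete enough to sketch in code. The checker below is a
deliberate simplification: pitches are MIDI note numbers, a "leap" is any move larger than a
whole step, and the permissible-sonority clause for successive leaps is omitted, so this is an
illustration of the rule's shape rather than a statement of modal counterpoint.

def leap_rule_violations(melody, step=2):
    """Return indices where a leap is not followed by a counter-step or leap."""
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > step:                              # a leap...
            step_back = abs(second) <= step and first * second < 0
            another_leap = abs(second) > step
            if not (step_back or another_leap):
                violations.append(i + 1)
    return violations

# D-F-G-A: the minor-third leap D->F is followed by a step in the SAME
# direction (F->G), so the simplified rule flags index 1.
print(leap_rule_violations([62, 65, 67, 69]))   # -> [1]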

Like Schenkerian analysis in its earlier days, set theory has had an air of the secret society
about it, with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus
often appeared to the uninitiated as the sterile application of arcane, mathematical concepts
to inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
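
Since "interval vector" and "6-Z44" appear above as the forbidding vocabulary, here is a short
sketch of what the interval-class vector actually is: for each interval class 1 through 6,
count the pairs of pitch classes in the set that form it. The example set is the prime form of
Forte's 6-Z44.

from itertools import combinations

def interval_class_vector(pcs):
    """Return the 6-entry interval-class vector of a pitch-class set."""
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        interval = (b - a) % 12
        ic = min(interval, 12 - interval)   # fold intervals 7..11 onto 5..1
        vector[ic - 1] += 1
    return vector

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # 6-Z44 -> [3, 1, 3, 4, 3, 1]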

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related
scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
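
As one concrete instance of the quantification described above (pitch pinned to frequency by
comparison with pure tones), here is the standard twelve-tone equal-temperament mapping between
frequency in hertz and MIDI note number. The A4 = 440 Hz reference is the usual convention, an
assumption here rather than something the quoted text specifies.

import math

def freq_to_midi(f_hz, a4=440.0):
    """Nearest equal-tempered MIDI note for a frequency in Hz."""
    return round(69 + 12 * math.log2(f_hz / a4))

def midi_to_freq(note, a4=440.0):
    """Frequency in Hz of an equal-tempered MIDI note number."""
    return a4 * 2 ** ((note - 69) / 12)

print(freq_to_midi(261.63))   # 60: middle C
print(midi_to_freq(69))       # 440.0 Hz: A4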

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-395-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-396-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
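
The consonance test described in the passage above can be stated mechanically. A toy sketch,
under my own simplifying assumptions (pitches as MIDI numbers, the lowest sounding note treated
as the bass, and the perfect fourth counted as dissonant only against the bass, as in the
strict style):

def consonant_with_all(candidate, voices):
    # voices: MIDI numbers of the currently sounding notes.
    bass = min(voices)
    for v in voices:
        ic = abs(candidate - v) % 12
        ok = ic in {0, 3, 4, 7, 8, 9}   # unison/octave, 3rds, P5, 6ths
        if ic == 5 and v != bass:
            ok = True                   # P4 between upper voices is consonant
        if not ok:
            return False
    return True

print(consonant_with_all(64, [48, 55, 60]))  # E4 over C3-G3-C4 -> True
print(consonant_with_all(65, [48, 55, 60]))  # F4 forms a 4th over the bass -> False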

...in its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
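
The "interval vector" mentioned above is easy to demystify in code: it counts, for every
unordered pair of pitch classes in a set, how often each interval class 1-6 occurs. A short
sketch; (0,1,2,5,6,9) is the prime form conventionally labeled 6-Z44, whose vector <313431> is
quoted here from standard set-class tables.

from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        ic = min(d, 12 - d)     # fold intervals larger than a tritone
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]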

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
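
The quantification step described earlier in the excerpt (assigning a pitch by comparison with
pure tones of known frequency) reduces, in equal temperament, to a logarithmic mapping. A
sketch assuming the conventional 440 Hz reference for A4 (MIDI note 69):

import math

def freq_to_pitch(f_hz, a4=440.0):
    # Nearest equal-tempered note (as a MIDI number) and the
    # deviation from it in cents.
    n = 69 + 12 * math.log2(f_hz / a4)   # fractional MIDI note number
    nearest = round(n)
    return nearest, 100 * (n - nearest)

print(freq_to_pitch(261.63))   # ~(60, ~0): middle C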

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-409-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from

Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
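
As a worked example of the quantification mentioned above (pitches "quantified as frequencies
in cycles per second"), here is the standard 12-tone equal-temperament mapping from a measured
frequency to the nearest named pitch; the A4 = 440 Hz reference and the MIDI numbering (note
69 = A4) are common conventions, not something asserted by the quoted text.

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(f_hz):
    """Nearest 12-TET pitch name for a frequency, relative to A4 = 440 Hz."""
    midi = round(69 + 12 * math.log2(f_hz / 440.0))  # 12 semitones per octave
    octave = midi // 12 - 1                          # MIDI note 60 = C4
    return f"{NAMES[midi % 12]}{octave}"

print(frequency_to_pitch(440.0))   # A4
print(frequency_to_pitch(261.63))  # C4 (middle C)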

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A'.
NB: A ∩ A' = ∅ (A and A' are disjoint).
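
Definitions (1)-(12) map almost directly onto code; a minimal sketch in Python, with illustrative names (AGGREGATE, pc) that are mine rather than the primer's:

# Pitches as integers (semitones); pitch classes as integers mod 12.
AGGREGATE = frozenset(range(12))      # U, the set of all twelve pcs

def pc(pitch: int) -> int:
    return pitch % 12                 # octave-related pitches -> same pc

pset = {60, 64, 67, 72}               # a pset: pitches, unordered in time
A = {pc(p) for p in pset}             # its pcset {0, 4, 7}; 72 folds onto 60
B = {0, 4, 8}

print(A | B)                  # union, A u B
print(A & B)                  # intersection, A n B
print(A.isdisjoint({1, 5}))   # disjoint iff the intersection is the null set
print(AGGREGATE - A)          # complement A': all pcs of U not in A
print(A & (AGGREGATE - A))    # NB: A n A' is the null set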

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from [...]
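
As a sketch of how mechanical the cantus-firmus leap rule quoted above is, here is a toy checker (my simplifications, not the author's: pitches as integers in semitones, a "step" taken as at most two semitones, and the permissible-sonority test for a second leap omitted):

def leaps_resolved(melody: list[int]) -> bool:
    # After a leap (> 2 semitones), require either a step in the opposite
    # direction or another leap; this toy version does not go on to test
    # whether two successive leaps outline a permissible three-note sonority.
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) > 2:                                   # a leap...
            step_back = 0 < abs(second) <= 2 and (second > 0) != (first > 0)
            another_leap = abs(second) > 2
            if not (step_back or another_leap):
                return False
    return True

print(leaps_resolved([60, 67, 65, 64]))  # leap up, step back down: True
print(leaps_resolved([60, 67, 69]))      # leap up, step up: False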

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional [...]

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
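
That convention of quantifying pitch against pure tones has a standard form in twelve-tone equal temperament; a minimal sketch, assuming the common A4 = 440 Hz reference (MIDI note 69):

import math

def frequency_to_midi(f_hz: float) -> float:
    # 12-TET pitch number from frequency, with A4 = 440 Hz = MIDI 69;
    # a fractional result indicates a pitch between semitones.
    return 69 + 12 * math.log2(f_hz / 440.0)

print(frequency_to_midi(440.0))    # 69.0  (A4)
print(frequency_to_midi(261.63))   # ~60.0 (C4, middle C)
print(frequency_to_midi(880.0))    # 81.0  (A5, one octave up)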

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-425-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-426-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-427-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-428-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-429-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-430-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

-431-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-432-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-433-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of

-434-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-435-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-436-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
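
The leap rule stated above is mechanical enough to check by program. A minimal sketch, assuming
pitches given as MIDI note numbers and treating any interval beyond two semitones as a leap
(both simplifications: the real rule is diatonic and also restricts which leap intervals are
permitted):

def check_leaps(melody, max_step=2):
    """Flag each leap not followed by a step in the opposite direction or another leap."""
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > max_step:                                # a leap...
            steps_back = abs(second) <= max_step and second * first < 0
            leaps_on = abs(second) > max_step
            if not (steps_back or leaps_on):
                violations.append(i + 1)                         # index of the leap's goal note
    return violations

# Leap 60->65 resolves by step down (fine); leap 62->67 continues by step up (flagged).
print(check_leaps([60, 65, 64, 62, 67, 69]))   # -> [4]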

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
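
For readers stopped cold by vocabulary like "interval vector," the object itself is small. A
sketch of the standard computation (the pc set {0,1,2,5,6,9} used in the example is, to my
understanding, a transposition of the hexachord Forte labels 6-Z44):

from itertools import combinations

def interval_class_vector(pcs):
    """Count interval classes 1-6 over all unordered pairs of a pitch-class set."""
    icv = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12
        icv[min(d, 12 - d) - 1] += 1   # interval class = smaller of d and its inverse
    return icv

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]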

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
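
The quantification described above has a compact closed form: in twelve-tone equal temperament
with A4 = 440 Hz, a frequency maps to a (possibly fractional) MIDI note number. A small
illustrative sketch (the A4 = 440 convention is an assumption, not from the article):

import math

def frequency_to_midi(f_hz, a4=440.0):
    """Map a frequency to the equal-tempered MIDI scale (A4 = note 69)."""
    return 69 + 12 * math.log2(f_hz / a4)

for f in (261.63, 440.0, 466.16):
    m = frequency_to_midi(f)
    print(f"{f:7.2f} Hz -> MIDI {m:6.2f} (nearest note {round(m)})")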
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-438-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-439-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-440-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-441-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-442-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-443-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

-444-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-445-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-446-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-447-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-448-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-449-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand

-450-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
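
Since the passage invokes "interval vector" and "6-Z44", a short sketch may demystify the
vocabulary: the interval vector of a pitch-class set simply counts, for each of the six
interval classes, how many pairs of the set's pitch classes lie that interval class apart.
(Python; the prime form {0,1,2,5,6,9} for 6-Z44 follows the standard Forte tables.)

from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs of distinct pcs.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        ic = min(d, 12 - d)
        vec[ic - 1] += 1
    return vec

# The hexachord 6-Z44, prime form {0,1,2,5,6,9}:
print(interval_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1], i.e. <313431>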

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
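
As a concrete instance of quantifying pitch by frequency, here is a minimal Python sketch
assuming 12-tone equal temperament with the common A4 = 440 Hz reference (a convention, not
something the excerpt itself fixes):

import math

def nearest_midi(freq_hz):
    # Nearest equal-tempered MIDI note number for a frequency in Hz.
    return round(69 + 12 * math.log2(freq_hz / 440.0))

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(midi):
    return NAMES[midi % 12] + str(midi // 12 - 1)

print(note_name(nearest_midi(261.6)))   # C4 (middle C)
print(note_name(nearest_midi(452.0)))   # A4: 452 Hz is nearest to A4 = 440 Hz
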
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL divergence between the joint density and the product of the
individual densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
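
A small numerical sketch of that difference (Python, assuming NumPy and SciPy): for y = x^2
with x symmetric about zero, the relationship is deterministic but neither linear nor
monotonic, so Pearson and Spearman are near zero while a histogram-based MI estimate is clearly
positive.

import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 5000)
y = x ** 2                               # deterministic, non-monotonic

print(pearsonr(x, y)[0])                 # ~0: no linear association
print(spearmanr(x, y)[0])                # ~0: no monotonic association

# Histogram-based MI estimate (in nats); the bin count is an arbitrary choice.
pxy, _, _ = np.histogram2d(x, y, bins=25)
pxy /= pxy.sum()
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0
print(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))   # clearly > 0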

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
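
A sketch of the three SSMs on two noisy, phase-shifted sinusoids (Python with NumPy; the
embedding dimension 3 for the ordinal patterns is my assumption, since the paper's exact
estimator settings are not given in the excerpt):

import numpy as np

def discrete_mi(a, b):
    # MI (nats) between two equal-length sequences of discrete symbols.
    ia = np.unique(a, return_inverse=True)[1]
    ib = np.unique(b, return_inverse=True)[1]
    p = np.zeros((ia.max() + 1, ib.max() + 1))
    np.add.at(p, (ia, ib), 1)
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

def ordinal_codes(x, m=3):
    # Bandt-Pompe style: label each length-m window by its sorting permutation.
    w = np.lib.stride_tricks.sliding_window_view(x, m)
    return np.argsort(w, axis=1) @ (m ** np.arange(m))

rng = np.random.default_rng(2)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.3 * rng.normal(size=t.size)
y = np.sin(t + 0.5) + 0.3 * rng.normal(size=t.size)

cc = abs(np.corrcoef(x, y)[0, 1])                 # |cross correlation|, zero lag
mi = discrete_mi(np.digitize(x, np.histogram_bin_edges(x, 16)),
                 np.digitize(y, np.histogram_bin_edges(y, 16)))
miop = discrete_mi(ordinal_codes(x), ordinal_codes(y))
print(cc, mi, miop)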


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,

-458-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-459-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-460-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch

-461-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-462-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-463-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between

-464-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
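
To make the mapping from frequency to pitch concrete, here is a minimal sketch (assuming A4 = 440 Hz and twelve-tone equal temperament; the function name is my own) that quantizes a measured frequency to the nearest equal-tempered note, much as a sound is matched against pure reference tones:

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz: float) -> str:
    # MIDI convention: note 69 is A4 (440 Hz); each semitone is a factor of 2**(1/12).
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(frequency_to_pitch(261.63))  # -> C4
print(frequency_to_pitch(446.0))   # -> A4: near-misses quantize to the nearest note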
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
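
A small numerical illustration of this difference (my own sketch, not from the quoted post): for y = x^2 with x symmetric around zero, the sample correlation is near zero, while a plug-in histogram estimate of I(X;Y) = sum over x,y of p(x,y) * log[p(x,y) / (p(x)p(y))] remains clearly positive.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100_000)
y = x ** 2                                    # dependent, but not monotonically

def mutual_information(a, b, bins=32):
    # Plug-in estimate of I(A;B) in bits from a 2-D histogram of the sample.
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of A
    py = pxy.sum(axis=0, keepdims=True)       # marginal of B
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

print(np.corrcoef(x, y)[0, 1])                # ~0.00: no linear relationship
print(mutual_information(x, y))               # > 0: the dependence is still detected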

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns (MIOP) [25]. The first is a linear measure and the latter two are non-linear ones.
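
The ordinal-pattern variant can be sketched as follows (my reading of the MIOP idea, not the authors' code; the pattern length m = 3 is an arbitrary choice here): each series is replaced by the sequence of rank-orderings of m consecutive samples, and mutual information is then computed between the two symbol streams.

import math
from itertools import permutations
import numpy as np

def ordinal_symbols(x, m=3):
    # Symbol = index of the permutation that sorts each length-m window.
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    windows = np.lib.stride_tricks.sliding_window_view(x, m)
    return np.array([patterns[tuple(np.argsort(w))] for w in windows])

def miop(x, y, m=3):
    # Mutual information (bits) between the two ordinal-pattern streams.
    sx, sy = ordinal_symbols(x, m), ordinal_symbols(y, m)
    edges = np.arange(math.factorial(m) + 1)
    pxy, _, _ = np.histogram2d(sx, sy, bins=(edges, edges))
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.normal(size=20_000)
print(miop(x, x + 0.3 * rng.normal(size=20_000)))  # coupled series: clearly positive
print(miop(x, rng.normal(size=20_000)))            # independent series: near zero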

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
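
The practical asymmetry described above is easy to demonstrate (again a sketch under my own assumptions, not taken from the quoted post): covariance falls straight out of sample moments, while a histogram-based MI estimate shifts with the bin count, an estimation choice with no counterpart on the covariance side.

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=5_000)
y = 0.5 * x + rng.normal(scale=0.5, size=5_000)

cov = (x * y).mean() - x.mean() * y.mean()    # plain moment arithmetic, no densities
print(cov)

for bins in (8, 32, 128):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
    print(bins, round(mi, 3))                 # the MI estimate drifts with binning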

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further.

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
1. "In memoriam Allen Forte, music theorist." news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd ed., ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.
Set Theory Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from finite set theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊂ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A'. NB: A ∩ A' = ∅ (A and A' are disjoint).
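
These definitions translate almost one-for-one into code. The sketch below (names mine; MIDI note numbers serve only as a convenient integer model of pitch space) represents pcs as integers mod 12 and pcsets as frozensets, which are unordered exactly as definitions (2) and (5) require:

AGGREGATE = frozenset(range(12))            # U, the aggregate of all twelve pcs
EMPTY = frozenset()                         # the null set

def pc(pitch: int) -> int:
    # Definition (4): pitches related by any number of octaves map to one pc.
    return pitch % 12

pset = [60, 64, 67, 72]                     # a pset: C4, E4, G4, C5
pcset = frozenset(pc(p) for p in pset)      # -> {0, 4, 7}; C4 and C5 merge

A = frozenset({0, 4, 7})
B = frozenset({0, 2, 4})
print(A | B)                                # union, A ∪ B
print(A & B)                                # intersection, A ∩ B
print(A & B == EMPTY)                       # disjoint? False: they share 0 and 4
print(AGGREGATE - A)                        # complement A': all pcs of U not in A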

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch

-479-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
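That quantification can be sketched as a mapping from frequency to the nearest equal-tempered pitch; the A4 = 440 Hz reference and the MIDI numbering are conventions assumed here, not fixed by the text above:

import math

def nearest_pitch(freq_hz):
    # Map a frequency to the nearest equal-tempered pitch, expressed as a
    # MIDI note number (69 = A4 = 440 Hz; both are assumptions of this sketch).
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(nearest_pitch(261.63))   # 60, i.e. middle C
print(nearest_pitch(445.0))    # 69: a slightly sharp A4 still maps to A4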
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
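Purely as a computational caricature of the two coding ideas (an illustration, not a model of auditory physiology), one can compare a period estimate read off the autocorrelation peak with a spectral-peak estimate on a synthetic harmonic-rich signal:

import numpy as np

fs = 8000                                    # sample rate in Hz
t = np.arange(int(0.1 * fs)) / fs
x = np.sign(np.sin(2 * np.pi * 220 * t))     # 220 Hz square wave, rich in harmonics

# "temporal" caricature: period taken from the first big autocorrelation peak
ac = np.correlate(x, x, mode="full")[x.size - 1:]
lag = np.argmax(ac[20:]) + 20                # skip the zero-lag region
print("autocorrelation estimate:", fs / lag)     # about 220 Hz

# "place" caricature: frequency of the strongest spectral bin
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
print("spectral-peak estimate:", freqs[np.argmax(spec)])   # 220.0 Hz here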

In essence, the algorithm looks at a string of intervals derived from the successive values in some musical dimension in a piece of music. The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one string can be selected for analysis. Then the algorithm combines the values of each dimension's successive intervals according to a user-specified average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: For four successive dimension values labeled A through D forming three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string of values is segmented in half; the value C starts a new segment or phrase. In its simplest form, using only one musical dimension, the algorithm works by going through the dimension's list of undirected intervals in threes looking for maximum values and segmenting accordingly. This results in a series of successive segments (or phrases). We can then average the values in each of the output segments to get a series of new higher-order values. We input these into the algorithm to produce a second-order segmentation and so forth, until the music is parsed into a single segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a shows the results using one dimension: pitch alone. The first pass segments the pitches into segments of three to six pitches; that is, the segmentation is determined by the sequence of the sizes of successive unordered pitch intervals. The segmental boundaries are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a lower register from the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation. Let us examine Example 21b with the algorithm taking only time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
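A minimal Python sketch of the single-dimension version described above. It is an interpretation, not the authors' code: peaks are taken as strict maxima, and each segment is summarized by its plain mean (the pitch example in the text uses the center of each segment's bandwidth instead); the weighted average across multiple dimensions is not implemented.

def segment_once(values):
    # One pass of the rule above: scan the undirected intervals between
    # successive values in threes; where the middle interval exceeds its
    # neighbors, the value after it (the "C" of Example 20) starts a segment.
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    starts = [0]
    for i in range(len(intervals) - 2):
        x, y, z = intervals[i], intervals[i + 1], intervals[i + 2]
        if y > x and y > z:
            starts.append(i + 2)
    return [values[s:e] for s, e in zip(starts, starts[1:] + [len(values)])]

def parse(values):
    # Recurse: replace each segment by its average and re-segment until the
    # music is parsed into a single segment; return every level.
    levels = [segment_once(values)]
    while len(levels[-1]) > 1:
        averages = [sum(seg) / len(seg) for seg in levels[-1]]
        levels.append(segment_once(averages))
    return levels

# e.g. a string of pitches as MIDI numbers:
print(parse([60, 62, 61, 72, 74, 73, 60, 62, 61]))
# [[[60, 62, 61], [72, 74, 73], [60, 62, 61]], [[61.0, 73.0, 61.0]]]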
read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx

If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:

srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, < 131 402>



(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.

(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

References
1. "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance
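That contrast is easy to demonstrate numerically. A small Python sketch, using a crude histogram plug-in estimator for MI (which also illustrates the quoted point that MI estimation is more delicate than estimating covariance); the bin count and sample size are arbitrary choices of this sketch:

import numpy as np

def mutual_information(x, y, bins=16):
    # Plug-in MI estimate from a 2-D histogram, in nats. A crude estimator:
    # the result depends on the binning, unlike the covariance computation.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=20_000)
y = x ** 2                                 # fully dependent, but not monotonic
print(np.corrcoef(x, y)[0, 1])             # near 0: no linear relationship
print(mutual_information(x, y))            # clearly positive: strong dependence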
Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
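A minimal sketch of (1)-(5) in Python; the particular psets are illustrative choices, not the primer's:

# Pitches as integers (here MIDI numbers), pcs as pitches mod 12.
pset = [60, 64, 67, 72, 76]                  # C4 E4 G4 C5 E5, low to high
pcset = frozenset(p % 12 for p in pset)
print(pcset)                                 # frozenset({0, 4, 7})

# Octave-related pitches map to the same pc, so a different pset
# can represent the same pcset:
another_pset = [48, 52, 55]
print(frozenset(p % 12 for p in another_pset) == pcset)   # True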




https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
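Rules of this kind are mechanical enough to check by machine. The sketch below is a
deliberately simplified toy, not a counterpoint grammar: it measures intervals in semitones,
hard-codes one plausible set of "permissible" leaps, and ignores the escape clause about two
successive leaps outlining a permissible three-note sonority.

PERMISSIBLE_LEAPS = {3, 4, 5, 7, 8, 12}  # semitones: 3rds, P4, P5, m6, octave
STEP = {1, 2}                            # semitone or whole tone

def melodic_violations(pitches):
    """Indices where a leap is not followed by a step in the opposite
    direction, or where a forbidden interval (e.g. tritone, 7th) occurs."""
    bad = []
    for i in range(len(pitches) - 1):
        a = pitches[i + 1] - pitches[i]
        if abs(a) in PERMISSIBLE_LEAPS and i + 2 < len(pitches):
            b = pitches[i + 2] - pitches[i + 1]
            if not (abs(b) in STEP and a * b < 0):   # leap must recover by step
                bad.append(i + 1)
        elif abs(a) not in STEP | PERMISSIBLE_LEAPS and a != 0:
            bad.append(i)                            # forbidden melodic interval
    return bad

# MIDI numbers: D4 F4 E4 D4 -- leap up a third, then step back down: fine
print(melodic_violations([62, 65, 64, 62]))  # []
# Leap of a fifth followed by a step in the SAME direction: flagged
print(melodic_violations([62, 69, 71, 69]))  # [1]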

From earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
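The "interval vector," at least, is nothing arcane: it simply counts, for each interval class 1
through 6, how many unordered pairs of pitch classes in a set lie that far apart. A short
sketch (the function name is mine):

from itertools import combinations

def interval_class_vector(pcs):
    """Count interval classes 1-6 among all unordered pairs of pitch classes."""
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((a - b) % 12, (b - a) % 12)  # interval class lies in 1..6
        vector[ic - 1] += 1
    return vector

# 6-Z44 in prime form is [0,1,2,5,6,9]; its interval vector is <313431>
print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # [3, 1, 3, 4, 3, 1]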

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related
scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
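A minimal sketch of that quantification, mapping a measured frequency to the nearest note of
12-tone equal temperament (the 440 Hz reference and the helper names are assumptions for
illustration):

import math

A4 = 440.0  # reference pitch; concert pitch varies in practice
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    """Return the nearest equal-tempered note name and the offset in cents."""
    semis = 12 * math.log2(freq_hz / A4)  # signed distance from A4 in semitones
    n = round(semis)
    cents = 100 * (semis - n)             # residual detuning from that note
    midi = 69 + n                         # MIDI convention: A4 = 69
    return f"{NAMES[midi % 12]}{midi // 12 - 1}", round(cents, 1)

print(nearest_note(440.0))    # ('A4', 0.0)
print(nearest_note(261.63))   # ('C4', ~0.0) -- middle C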
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-507-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-508-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-509-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
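
As a rough illustration of the leap rule just described (a toy sketch using an invented
scale-degree encoding; real counterpoint adds many further conditions, and the
three-note-sonority proviso is omitted here):

def leap_rule_ok(melody):
    """After a leap (any move larger than a step), require a step back in the
    opposite direction or another leap. Melody is a list of scale degrees."""
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        if abs(b - a) > 1:                               # a leap just happened
            step_back = abs(c - b) == 1 and (c - b) * (b - a) < 0
            another_leap = abs(c - b) > 1
            if not (step_back or another_leap):
                return False
    return True

print(leap_rule_ok([1, 3, 2, 1]))  # leap up, step back down -> True
print(leap_rule_ok([1, 4, 5, 4]))  # leap up, then step up   -> False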

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
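
Since the passage invokes "interval vector" and "6-Z44", here is a short sketch of how an
interval-class vector is computed, assuming the usual prime form [0, 1, 2, 5, 6, 9] for 6-Z44;
the printed result matches the textbook vector 313431.

from itertools import combinations

def interval_vector(pcs):
    """Count interval classes 1..6 over all unordered pc pairs (mod 12)."""
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))     # -> [3, 1, 3, 4, 3, 1]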

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
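
As a sketch of the quantification described above (frequencies in hertz compared against pure
tones), the following maps a measured frequency to the nearest pitch of 12-tone equal
temperament, assuming A4 = 440 Hz; the note-name scheme is my own convenience.

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(f_hz):
    """Return (note name, octave, cents offset) for a frequency in hertz."""
    midi = 69 + 12 * math.log2(f_hz / 440.0)   # MIDI number, A4 = 69
    nearest = round(midi)
    cents = 100 * (midi - nearest)             # deviation from that pitch
    return NAMES[nearest % 12], nearest // 12 - 1, round(cents, 1)

print(freq_to_pitch(440.0))    # ('A', 4, 0.0)
print(freq_to_pitch(261.63))   # ('C', 4, ~0.0)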

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-522-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

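The leap rule at the start of the passage above is mechanical enough to state as code. A toy
sketch (Python; the allowed-interval set is a simplified placeholder and the permissible
three-note-sonority test for successive leaps is omitted, so this is nothing like a full
counterpoint checker): scan a melody for leaps and flag any leap not answered by a step in the
opposite direction or by another permissible leap:

# Pitches as MIDI note numbers; a "step" is 1-2 semitones, a "leap" is larger.
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # m3, M3, P4, P5, m6, P8 (simplified)

def leap_rule_violations(melody):
    # Return indices where a leap is not followed by a step in the opposite
    # direction or by another allowed leap (sonority test omitted).
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) <= 2:
            continue                          # not a leap
        if abs(first) not in ALLOWED_LEAPS:
            violations.append(i)              # forbidden leap interval
            continue
        steps_back = abs(second) <= 2 and first * second < 0
        leaps_on = abs(second) in ALLOWED_LEAPS
        if not (steps_back or leaps_on):
            violations.append(i)
    return violations

# The P4 leap D-G resolves down by step, so nothing is flagged.
print(leap_rule_violations([62, 67, 65, 64, 62]))   # -> []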
In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant?

Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal
music uses only a small number of referential sonorities (triads and seventh chords);
post-tonal music presents an extraordinary variety of musical configurations. Tonal music
shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

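Since the oscillation rate, unlike the pitch percept, is directly measurable, here is a minimal
sketch of quantifying it (Python/numpy; simple FFT peak-picking, which presumes a strong
periodic fundamental and would need refinement for complex or aperiodic sounds):

import numpy as np

fs = 44_100                              # sample rate in Hz
t = np.arange(fs) / fs                   # one second of signal
wave = np.sin(2 * np.pi * 440.0 * t)     # A4 pure tone
wave += 0.2 * np.random.default_rng(3).normal(size=t.size)   # add some noise

spectrum = np.abs(np.fft.rfft(wave))     # magnitude spectrum, 0..fs/2
freqs = np.fft.rfftfreq(wave.size, d=1.0 / fs)

peak_hz = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
print(f"estimated frequency: {peak_hz:.1f} Hz")  # ~440.0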
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-524-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-525-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-526-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-527-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-528-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-529-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-530-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-531-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-532-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-533-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-534-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
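
These melodic constraints are easy to state operationally. As a toy illustration only (the
interval sets below are simplified assumptions, and the "another leap" escape clause from the
text is omitted for brevity), a leap-recovery check might look like:

# Toy check of one cantus-firmus rule: a leap must be followed by a step
# in the opposite direction. Intervals are in semitones.
STEP = {1, 2}
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}  # assumed, not a complete rule set

def leaps_recovered(melody):
    ok = True
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) in PERMITTED_LEAPS:          # a leap...
            ok = ok and abs(second) in STEP        # ...followed by a step...
            ok = ok and first * second < 0         # ...in the opposite direction
    return ok

print(leaps_recovered([60, 65, 64, 62]))  # leap up a 4th, then steps down: True
print(leaps_recovered([60, 65, 67, 65]))  # leap up, then a step up: False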

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
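
The "interval vector" mentioned above is, at least, straightforward to compute: it counts how
many of each interval class (1 through 6) occur among all pairs of pcs in a set. A minimal
sketch, using {0, 1, 2, 5, 6, 9} as an assumed representative of 6-Z44:

from itertools import combinations

def interval_class_vector(pcset):
    """Count interval classes 1..6 over all unordered pairs of pcs."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = min((a - b) % 12, (b - a) % 12)  # fold intervals into classes 1..6
        vec[ic - 1] += 1
    return vec

# Assuming {0, 1, 2, 5, 6, 9} as a representative of 6-Z44, this prints
# [3, 1, 3, 4, 3, 1], i.e. the vector usually written <313431>.
print(interval_class_vector([0, 1, 2, 5, 6, 9]))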

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
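
The quantification described earlier (associating a pitch with a frequency by comparison
against equal-tempered references) reduces to a standard formula, midi = 69 + 12*log2(f/440);
a small sketch:

import math

A4 = 440.0  # reference tuning, Hz
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz):
    """Map a frequency to the nearest equal-tempered pitch (MIDI numbering)."""
    midi = round(69 + 12 * math.log2(freq_hz / A4))
    return NAMES[midi % 12] + str(midi // 12 - 1)

print(nearest_pitch(261.63))  # ~ "C4" (middle C)
print(nearest_pitch(440.0))   # "A4"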

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
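
A minimal rendering of these definitions in Python, with sets standing in for psets and pcsets:

# The primer's definitions, numbered as above, using Python sets.
U = set(range(12))          # (6) the aggregate: all twelve pcs

def pc(pitch):
    """(4) Map a pitch (in semitones, as an integer) to its pitch-class."""
    return pitch % 12

A = {pc(p) for p in [60, 64, 67, 72]}   # a pset realized as the pcset {0, 4, 7}
B = {0, 2, 4}

print(A | B)             # (9)  union
print(A & B)             # (10) intersection
print(A & B == set())    # (11) disjoint iff the intersection is empty
print(U - A)             # (12) the complement A′ of A in U
print(A & (U - A))       # NB:  A ∩ A′ is always the empty set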

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the

-544-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-545-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-546-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-547-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-548-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
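
As an illustration of how mechanical the leap rule quoted above is, here is a minimal sketch
that checks it over a melody given as MIDI note numbers. The set of permitted leap sizes and
the step threshold are illustrative assumptions, and the permissible-sonority condition on two
successive leaps is omitted.

# Illustrative leap/step boundaries in semitones; not a full statement
# of the modal rules (m3, M3, P4, P5, m6, P8).
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}

def leap_violations(melody):
    """Indices of notes reached by leap but not answered by a contrary
    step or by another leap."""
    bad = []
    for i in range(len(melody) - 2):
        a = melody[i + 1] - melody[i]       # interval into note i+1
        b = melody[i + 2] - melody[i + 1]   # interval out of note i+1
        if abs(a) in PERMITTED_LEAPS:                  # note i+1 reached by leap
            contrary_step = abs(b) <= 2 and a * b < 0  # step, opposite direction
            another_leap = abs(b) in PERMITTED_LEAPS
            if not (contrary_step or another_leap):
                bad.append(i + 1)
    return bad

print(leap_violations([60, 65, 64, 62, 67, 69]))  # -> [3]: leap 62->67 is
                                                  # answered by an upward step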

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
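
The two terms are less forbidding than they look. As a concrete illustration, here is a short
sketch (standard-library Python) that computes the interval-class vector of any pitch-class
set, applied to the prime form of 6-Z44:

from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector of a pitch-class set: counts of ic 1..6."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12                # directed pc interval, 1..11
        vec[min(d, 12 - d) - 1] += 1    # fold into interval class 1..6
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]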

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
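
As a toy illustration of assigning a frequency-based pitch to a periodic signal (in the spirit
of the temporal theories just described), here is a naive autocorrelation estimator. The method
and parameter choices are illustrative, not drawn from the sources quoted above.

import numpy as np

def estimate_pitch(signal, sr, fmin=50.0, fmax=1000.0):
    """Naive autocorrelation pitch estimate in Hz."""
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]  # lags >= 0
    lo, hi = int(sr / fmax), int(sr / fmin)                  # plausible lag range
    lag = lo + int(np.argmax(ac[lo:hi]))                     # strongest periodicity
    return sr / lag

sr = 44_100
t = np.arange(0, 0.05, 1 / sr)
# Complex (non-sinusoidal) tone: fundamental plus one overtone.
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(round(estimate_pitch(tone, sr), 1))  # ~220 Hz, up to lag quantization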

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-552-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-553-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...

-554-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-555-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-556-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional

-557-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-558-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-559-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-560-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-561-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-562-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others
have, however, called such work into question.[2] Other theorists have obviated voice-leading
as a criterion for distinguishing linear aspects of pitch structure. For example, in my own
theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of musical
articulation to maintain an association between the components of a given lyne.[3] For
instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of
articulation, or any combination of these, thereby separating it out from
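The leap rule quoted above is mechanical enough to state as code. Below is a toy check under
strong simplifications of my own: intervals are counted in semitones rather than diatonic
steps, a "leap" is anything larger than two semitones, and the permissible-sonorities clause is
reduced to a whitelist of leap sizes. It is a sketch, not a faithful encoding of modal
counterpoint.

ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # m3, M3, P4, P5, m6, P8 in semitones

def check_leaps(melody):
    """melody: list of MIDI note numbers; returns (index, problem) pairs."""
    problems = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:                          # the first move is a leap
            if abs(first) not in ALLOWED_LEAPS:
                problems.append((i, "leap of a forbidden size"))
            contrary_step = abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2
            if not (contrary_step or another_leap):
                problems.append((i + 1, "leap not followed by contrary step or second leap"))
    return problems

print(check_leaps([60, 64, 62, 60, 65, 64]))   # [] : both leaps resolve by contrary step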

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password: a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
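The "forbidding" vocabulary is, at least, easy to compute. Here is a short sketch of the
interval-class vector, the six-entry tally that underlies labels like "6-Z44"; the pitch-class
set [0, 1, 2, 5, 6, 9] used in the demo is the prime form commonly given for 6-Z44.

from itertools import combinations

def interval_class_vector(pcs):
    # tally interval classes 1..6 over all unordered pairs of the pc set
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        d = (b - a) % 12
        vec[min(d, 12 - d) - 1] += 1
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]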

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
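Since the excerpt notes that pitches are usually quantified as frequencies, the standard
mapping is worth writing out: in twelve-tone equal temperament with the common A4 = 440 Hz
reference, MIDI note n corresponds to f = 440 * 2**((n - 69) / 12) Hz.

def midi_to_hz(n, a4=440.0):
    # equal-tempered pitch-to-frequency mapping; MIDI 69 is A4 by convention
    return a4 * 2.0 ** ((n - 69) / 12.0)

print(midi_to_hz(69))            # 440.0
print(round(midi_to_hz(60), 2))  # middle C: 261.63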

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)
(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not
arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the same voice,
or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers
and analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do with
tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of compositional
design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch,
time, and other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be associated
with a register, an instrument, a dynamic level, a mode of articulation, or any combination of
these, thereby separating it out from
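
The leap rule quoted above (a leap must be followed by a step in the opposite direction or by
another leap) is mechanical enough to check in code. Below is a minimal sketch, not from the
source: melodies are lists of MIDI note numbers, a "step" is assumed to be at most two
semitones, and the permissible three-note-sonority test is omitted for brevity.

STEP_MAX = 2  # semitones; anything larger counts as a leap (simplifying assumption)

def check_leaps(melody):
    # melody: list of MIDI note numbers for a single voice
    problems = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > STEP_MAX:
            # a leap must be followed by a step back or by another leap
            step_back = 0 < abs(second) <= STEP_MAX and first * second < 0
            another_leap = abs(second) > STEP_MAX
            if not (step_back or another_leap):
                problems.append((i, first, second))
    return problems

print(check_leaps([60, 67, 65]))  # leap up, step back down: []
print(check_leaps([60, 67, 69]))  # leap up, step in same direction: [(0, 7, 2)]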

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the
uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration among
musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music
uses only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
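
The "interval vector" named above is easy to make concrete: it counts, for a set of pitch
classes, how many unordered pairs lie each of the six interval classes apart. A minimal sketch;
the prime form (0,1,2,5,6,9) used here for 6-Z44 is the standard identification but should be
checked against Forte's tables.

from itertools import combinations

def interval_class_vector(pcs):
    # count interval classes 1..6 over all unordered pairs of pitch classes
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)  # fold mod-12 interval to 1..6
        vec[ic - 1] += 1
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]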

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
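
The earlier remark that pitches are "usually associated with, and thus quantified as
frequencies" can be illustrated in a few lines. A toy sketch, not a real pitch tracker: it
recovers the frequency of a synthetic pure tone from the peak of its magnitude spectrum, and
would misbehave on complex or inharmonic sounds.

import numpy as np

fs = 44100                              # sample rate in Hz
t = np.arange(fs) / fs                  # one second of samples
wave = np.sin(2 * np.pi * 440.0 * t)    # pure tone at 440 Hz (A4)

spectrum = np.abs(np.fft.rfft(wave))
freqs = np.fft.rfftfreq(wave.size, d=1 / fs)
print(freqs[spectrum.argmax()])         # ~440.0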

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
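
A small experiment makes the quoted contrast concrete. In the sketch below (illustrative only;
the histogram-based MI estimator and its bin count are choices made here, not part of the
quote), y depends strongly but non-monotonically on x, so both correlations sit near zero while
the MI estimate is clearly positive.

import numpy as np
from scipy.stats import pearsonr, spearmanr

def mi_histogram(x, y, bins=32):
    # plug-in MI estimate (in nats) from a 2-D histogram of the sample
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2 + 0.05 * rng.normal(size=x.size)  # strong but non-monotonic dependence

print(pearsonr(x, y)[0])    # ~0: no linear relationship
print(spearmanr(x, y)[0])   # ~0: no monotonic relationship
print(mi_histogram(x, y))   # clearly > 0: dependence detected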

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics
of the oscillators through the computation of a statistical similarity measure (SSM). In this
work we used three SSMs, namely the absolute value of the cross correlation (also known as
Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns, MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
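
A sketch of the ordinal-pattern idea behind the third SSM: each length-d window of a series is
replaced by the permutation that sorts it, and MI is then estimated between the two resulting
symbol streams. The embedding dimension d = 3 and the plug-in estimator below are choices made
for illustration, not details taken from the cited paper.

import numpy as np
from itertools import permutations
from math import factorial

def ordinal_symbols(x, d=3):
    # map each length-d window to the index of the permutation that sorts it
    lookup = {p: i for i, p in enumerate(permutations(range(d)))}
    return [lookup[tuple(int(j) for j in np.argsort(x[i:i + d]))]
            for i in range(len(x) - d + 1)]

def mi_symbols(a, b, k):
    # plug-in MI (nats) between two equal-length symbol streams over k symbols
    joint = np.zeros((k, k))
    for s, t in zip(a, b):
        joint[s, t] += 1
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 60, 2000)) + 0.2 * rng.normal(size=2000)
y = np.roll(x, 3)                       # a lagged, hence dependent, copy of x
d = 3
print(mi_symbols(ordinal_symbols(x, d), ordinal_symbols(y, d), factorial(d)))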

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be calculated
directly from a data sample without the need to actually know the probability distributions
involved (since it is an expression involving moments of the distribution), while Mutual
Information requires knowledge of the distributions, whose estimation, if unknown, is a much
more delicate and uncertain task compared to the estimation of Covariance.
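
The practical asymmetry described above shows up in miniature: the sample covariance is a moment
computed directly from the data, while a plug-in MI estimate inherits a free parameter from the
density approximation (here, the histogram bin count), and the estimate moves as that parameter
changes.

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=5_000)
y = 0.6 * x + 0.8 * rng.normal(size=5_000)

print(np.cov(x, y)[0, 1])   # a moment: computed directly from the sample

def mi_histogram(x, y, bins):
    # same plug-in estimator as above; repeated so this sketch stands alone
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

for bins in (8, 32, 128):
    # the MI estimate shifts with the bin count, i.e. with the density model
    print(bins, mi_histogram(x, y, bins))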

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-579-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-580-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of

-581-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-582-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-583-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and

-584-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-585-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-586-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-587-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-588-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-589-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-590-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

-591-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
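Since pitches are quantified by comparison with pure tones of known frequency, the mapping can be made concrete with the standard equal-temperament formula; a minimal sketch, assuming the MIDI convention A4 = 440 Hz = note number 69 (the function names are mine, not from the quoted article):

import math

# Map between frequency in Hz and the nearest equal-tempered pitch,
# using the MIDI convention A4 = 440 Hz = note number 69.
def nearest_midi_note(freq_hz):
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def note_to_freq(midi_note):
    return 440.0 * 2 ** ((midi_note - 69) / 12)

print(nearest_midi_note(261.63))   # ~ middle C -> 60
print(note_to_freq(69))            # 440.0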
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
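As a numeric illustration of the contrast drawn above (mine, not from the quoted threads): Pearson correlation is near zero for a noise-free quadratic dependence, while even a crude histogram-based plug-in estimate of mutual information comes out clearly positive. NumPy is assumed, and the bin count is an arbitrary choice:

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2                      # deterministic but non-monotonic dependence

# Pearson correlation: near zero despite perfect dependence.
r = np.corrcoef(x, y)[0, 1]

# Crude plug-in MI estimate from a 2-D histogram (bin count is arbitrary).
def mutual_information(x, y, bins=20):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)    # marginal of x
    py = pxy.sum(axis=0, keepdims=True)    # marginal of y
    nz = pxy > 0                           # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(f"Pearson r = {r:.3f}")                                 # ~ 0.0
print(f"MI estimate = {mutual_information(x, y):.3f} nats")   # clearly > 0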

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)

(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
Jump up ^ "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
Jump up ^ http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
Jump up ^ Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter,"
Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 8.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
Jump up ^ Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2
(1964): 136-183.
Jump up ^ David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove
Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford
University Press, 2013), 8:175-176.

Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from finite set theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
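The mod-12 mapping in (4) and the operations in (7)-(12) compute directly; a minimal sketch (the variable names and the triad examples are mine, not from the primer):

# Minimal sketch of the pc and pcset definitions above.
AGGREGATE = frozenset(range(12))          # U, the set of all twelve pcs

def pc(pitch):
    """Map a pitch (integer, middle C = 60) to its pitch-class, mod 12."""
    return pitch % 12

def pcset(pitches):
    """Octave-related pitches collapse to one pc, so a pset maps to a pcset."""
    return frozenset(pc(p) for p in pitches)

A = pcset([60, 64, 67])        # C major triad as a pcset -> {0, 4, 7}
B = pcset([67, 71, 74])        # G major triad            -> {7, 11, 2}

union        = A | B           # (9)  A ∪ B
intersection = A & B           # (10) A ∩ B -> {7}
disjoint     = not (A & B)     # (11) True only if A ∩ B = ∅
complement   = AGGREGATE - A   # (12) A′, everything in U not in A
assert A & complement == frozenset()   # NB: A ∩ A′ = ∅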

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1]
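The leap rule above lends itself to a direct check; this toy sketch flags leaps that are not answered by a step in the opposite direction (the rule set is deliberately simplified, and the function name is mine):

def leap_rule_violations(melody, max_step=2):
    """Return indices of leaps (in semitones, as MIDI numbers) that are not
    followed by a step in the opposite direction. The 'permissible second
    leap' case from the counterpoint rule is omitted for brevity."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > max_step:                              # a leap...
            answered = abs(second) <= max_step and first * second < 0
            if not answered:
                bad.append(i + 1)
    return bad

print(leap_rule_violations([60, 65, 64, 62, 67, 65]))   # [] - leaps resolve by step
print(leap_rule_violations([60, 65, 67]))               # [1] - leap continues upward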

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -

https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"

to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <1 3 1 4 0 2>.

In essence, the algorithm looks at a string of intervals derived from the successive values in some musical dimension in a piece of music. The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one string can be selected for analysis. Then the algorithm combines the values of each dimension's successive intervals according to a user-specified average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four successive dimension values labeled A through D forming three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string of values is segmented in half; the value C starts a new segment or phrase. In its simplest form, using only one musical dimension, the algorithm works by going through the dimension's list of undirected intervals in threes looking for maximum values and segmenting accordingly. This results in a series of successive segments (or phrases). We can then average the values in each of the output segments to get a series of new higher-order values. We input these into the algorithm to produce a second-order segmentation, and so forth, until the music is parsed into a single segment. To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a shows the results using one dimension, pitch alone. The first pass segments the pitches into segments of three to six pitches; that is, the segmentation is determined by the sequence of the sizes of successive unordered pitch intervals.
unordered pitch intervals. The segmental boundaries In essence, the algorithm looks at a string
of intervals derived from the successive values in some musical dimension in a piece of music.
The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and
so forth. More than one string can be selected for analysis. Then the algorithm combines the
values of each dimension's suc- cessive intervals according to a user-specified average which
assigns a relative "weight" to each of the dimensions. Example 20 illustrates the principle of
the Tenney/ Polansky algorithm: For four successive dimension values labeled A through D
forming three successive unordered intervals labeled X, Y, and Z, if the middle interval is
greater than the other two intervals, the string of values is segmented in half; the value C
starts a new segment or phrase. In its simplest form, using only one musical dimension, the
algorithm works by going through the dimension's list of un- directed intervals in threes
looking for maximum values and segmenting accordingly. This results in a series of successive
segments (or phrases). We can then average the values in each of the output segments to get a
series of new higher- order values. We input these into the algorithm to produce a second-order
segmentation and so forth, until the music is parsed into a single segment. To illustrate the
Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a shows the results
using one dimension -pitch alone. The first pass segments the pitches into segments of three to
six pitches; that is, the seg- mentation is determined by the sequence of the sizes of suc-
cessive unordered pitch intervals. The segmental boundaries In essence, the algorithm looks at
a string of intervals derived from the successive values in some musical dimension in a piece
of music. The string might be a series of pitch intervals, time intervals (delays), dynamic
changes, and so forth. More than one string can be selected for analysis. Then the algorithm
combines the values of each dimension's suc- cessive intervals according to a user-specified
average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the
principle of the Tenney/ Polansky algorithm: For four successive dimension values labeled A
through D forming three successive unordered intervals labeled X, Y, and Z, if the middle
interval is greater than the other two intervals, the string of values is segmented in half;
the value C starts a new segment or phrase. In its simplest form, using only one musical
dimension, the algorithm works by going through the dimension's list of un- directed intervals
in threes looking for maximum values and segmenting accordingly. This results in a series of
successive segments (or phrases). We can then average the values in each of the output segments
to get a series of new higher- order values. We input these into the algorithm to produce a
second-order segmentation and so forth, until the music is parsed into a single segment. To
illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension -pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the seg- mentation is determined by the sequence of
the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In essence, the
algorithm looks at a string of intervals derived from the successive values in some musical
dimension in a piece of music. The string might be a series of pitch intervals, time intervals
(delays), dynamic changes, and so forth. More

than one string can be selected for analysis. Then the algorithm combines the values of each
dimension's suc- cessive intervals according to a user-specified average which assigns a
relative "weight" to each of the dimensions. Example 20 illustrates the principle of the
Tenney/ Polansky algorithm: For four successive dimension values labeled A through D forming
three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater
than the other two intervals, the string of values is segmented in half; the value C starts a
new segment or phrase. In its simplest form, using only one musical dimension, the algorithm
works by going through the dimension's list of un- directed intervals in threes looking for
maximum values and segmenting accordingly. This results in a series of successive segments (or
phrases). We can then average the values in each of the output segments to get a series of new
higher- order values. We input these into the algorithm to produce a second-order segmentation
and so forth, until the music is parsed into a single segment. To illustrate the
Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a shows the results

-607-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
using one dimension -pitch alone. The first pass segments the pitches into segments of three to
six pitches; that is, the seg- mentation is determined by the sequence of the sizes of suc-
cessive unordered pitch intervals. The segmental boundaries In essence, the algorithm looks at
a string of intervals derived from the successive values in some musical dimension in a piece
of music. The string might be a series of pitch intervals, time intervals (delays), dynamic
changes, and so forth. More than one string can be selected for analysis. Then the algorithm
combines the values of each dimension's suc- cessive intervals according to a user-specified
average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the
principle of the Tenney/ Polansky algorithm: For four successive dimension values labeled A
through D forming three successive unordered intervals labeled X, Y, and Z, if the middle
interval is greater than the other two intervals, the string of values is segmented in half;
the value C starts a new segment or phrase. In its simplest form, using only one musical
dimension, the algorithm works by going through the dimension's list of un- directed intervals
in threes looking for maximum values and segmenting accordingly. This results in a series of
successive segments (or phrases). We can then average the values in each of the output segments
to get a series of new higher- order values. We input these into the algorithm to produce a
second-order segmentation and so forth, until the music is parsed into a single segment. To
illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension -pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the seg- mentation is determined by the sequence of
the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In essence, the
algorithm looks at a string of intervals derived from the successive values in some musical
dimension in a piece of music. The string might be a series of pitch intervals, time intervals
(delays), dynamic changes, and so forth. More than one string can be selected for analysis.
Then the algorithm combines the values of each dimension's suc- cessive intervals according to
a user-specified average which assigns a relative "weight" to each of the dimensions. Example
20 illustrates the principle of the Tenney/ Polansky algorithm: For four successive dimensi

on values labeled A through D forming three successive unordered intervals labeled X, Y, and Z,
if the middle interval is greater than the other two intervals, the string of values is
segmented in half; the value C starts a new segment or phrase. In its simplest form, using only
one musical dimension, the algorithm works by going through the dimension's list of un-
directed intervals in threes looking for maximum values and segmenting accordingly. This
results in a series of successive segments (or phrases). We can then average the values in each
of the output segments to get a series of new higher- order values. We input these into the
algorithm to produce a second-order segmentation and so forth, until the music is parsed into a
single segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg
piece. Example 21a shows the results using one dimension -pitch alone. The first pass segments
the pitches into segments of three to six pitches; that is, the seg- mentation is determined by
the sequence of the sizes of suc- cessive unordered pitch intervals. The segmental boundaries
The segmental boundaries are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure, since they fall in a lower register than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation. Let us examine Example 21b, with the algorithm taking only time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
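
Stated as code, the core of the procedure is compact. The following is a minimal sketch in Python, assuming the "greater than both neighbors" test described above and a plain arithmetic mean for the higher-order values (for the pitch dimension the text instead uses the center pitch of each segment's bandwidth); the function names are illustrative, not Tenney and Polansky's.

def segment_once(values):
    # Unordered (absolute) intervals between successive dimension values.
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    # Scan intervals in threes (X, Y, Z); when the middle interval exceeds
    # both neighbors, the value after it (C in Example 20) opens a new segment.
    boundaries = [i + 1 for i in range(1, len(intervals) - 1)
                  if intervals[i] > intervals[i - 1] and intervals[i] > intervals[i + 1]]
    starts = [0] + boundaries
    ends = boundaries + [len(values)]
    return [values[s:e] for s, e in zip(starts, ends)]

def parse(values):
    # First pass on the raw values, then successive passes on the segment
    # averages, until the music is parsed into a single segment. Each pass
    # returns strictly fewer segments than it had input values, so this ends.
    levels = [segment_once(values)]
    while len(levels[-1]) > 1:
        averages = [sum(seg) / len(seg) for seg in levels[-1]]
        levels.append(segment_once(averages))
    return levels

# e.g. parse([64, 66, 67, 65, 60, 59, 62, 61]) -> nested levels of segments

Weighting several dimensions at once, as in Example 21c, amounts to combining each dimension's values by the user-specified average (in 21c, simply adding the pitch and duration values) before computing the intervals that segment_once scans.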

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
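
The point is easy to check numerically. Below is a small sketch, assuming numpy is available; the two-dimensional-histogram MI estimate is a crude plug-in used only for illustration. With Y = X^2 on a symmetric sample, the sample covariance is essentially zero while the variables remain completely dependent.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2                       # deterministic dependence, but symmetric

# Covariance comes straight from sample moments: E[XY] - E[X]E[Y].
print(np.cov(x, y)[0, 1])        # ~0: no linear association

# A crude plug-in MI estimate from a 2-D histogram of the sample (in nats).
counts, _, _ = np.histogram2d(x, y, bins=20)
pxy = counts / counts.sum()
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0
mi = (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()
print(mi)                        # clearly > 0: the variables are dependent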
ry Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
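
Read back into code, the primer's definitions map directly onto Python's built-in set type. A minimal sketch follows; the MIDI-style convention middle C = 60 is an assumption made for the example, not part of the primer.

U = set(range(12))                  # (6) the aggregate: all twelve pcs

def pc(pitch: int) -> int:
    # (4) a pitch (integer semitones, middle C = 60 assumed) maps to its
    # pitch-class mod 12; octave-related pitches coincide.
    return pitch % 12

pset = {64, 66, 67, 65}             # a pset: E4, F#4, G4, F4 as integers
pcset = {pc(p) for p in pset}       # (5) its unordered pcset: {4, 5, 6, 7}

A, B = {0, 4, 7}, {4, 7, 11}        # two pcsets
assert 4 in A                       # (7) membership: 4 ∈ A
assert {4, 7} <= A                  # (8) inclusion: {4, 7} ⊆ A
union = A | B                       # (9) A ∪ B == {0, 4, 7, 11}
common = A & B                      # (10) A ∩ B == {4, 7}
A_prime = U - A                     # (12) complement of A within the aggregate
assert A & A_prime == set()         # NB: A ∩ A′ = ∅, as in (11)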

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional o

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
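
Spelling the "KL distance" remark out for the discrete case:

I(X;Y) \;=\; D_{\mathrm{KL}}\big(p(x,y)\,\Vert\,p(x)\,p(y)\big) \;=\; \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}

This quantity is zero exactly when p(x,y) = p(x)p(y), that is, when X and Y are independent; correlation, by contrast, vanishes whenever the relationship merely has no linear (or, for Spearman, monotonic) component.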

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch is closely related to frequency, but the two are not equivalent. Frequency is an objective, scientific attribute that can be measured. Pitch is each person's subjective perception of a sound wave, which cannot be directly measured. However, this does not necessarily mean that most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of the auditory system work together to yield the experience of pitch. In general, pitch perception theories can be divided into place coding and temporal coding. Place theory holds that the perception of pitch is determined by the place of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for the perception of high frequencies, since neurons have an upper limit on how fast they can phase-lock their action potentials.[6] However, a purely place-based theory cannot account for the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
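
Since pitches are "usually associated with, and thus quantified as, frequencies," the standard twelve-tone equal-temperament mapping is a one-liner. A small sketch, assuming the common MIDI convention of A4 = note 69 = 440 Hz:

import math

def note_to_hz(n: float) -> float:
    # Equal temperament: each semitone multiplies frequency by 2**(1/12).
    return 440.0 * 2.0 ** ((n - 69) / 12)

def hz_to_note(f: float) -> float:
    # Inverse mapping; a fractional result measures deviation in semitones.
    return 69 + 12 * math.log2(f / 440.0)

print(note_to_hz(60))      # middle C: ~261.63 Hz
print(hz_to_note(523.25))  # ~72.0, the C an octave above middle C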

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time-series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
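
For orientation, here is a compact sketch of the three SSMs for a pair of equal-length series, assuming numpy. The MI estimator is a naive plug-in over symbol counts, and the ordinal-pattern order m = 3 is an arbitrary illustrative choice; neither detail is taken from the paper.

import numpy as np
from collections import Counter

def plugin_mi(a, b):
    # MI in bits from empirical symbol frequencies of two aligned sequences.
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum((c / n) * np.log2(c * n / (pa[x] * pb[y]))
               for (x, y), c in pab.items())

def ordinal_patterns(x, m=3):
    # Replace each length-m window by the permutation that sorts it.
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def ssms(x, y, bins=8):
    x, y = np.asarray(x, float), np.asarray(y, float)
    cc = abs(np.corrcoef(x, y)[0, 1])                     # |CC|: the linear SSM
    dx = np.digitize(x, np.histogram_bin_edges(x, bins))  # coarse-grain for MI
    dy = np.digitize(y, np.histogram_bin_edges(y, bins))
    mi = plugin_mi(list(dx), list(dy))                    # MI on binned values
    miop = plugin_mi(ordinal_patterns(x), ordinal_patterns(y))  # MIOP
    return cc, mi, miop

For two phase-shifted sinusoids, for example, all three measures should report a strong association; for y = x**2 on a symmetric sample, |CC| collapses while the two information-based measures should not.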

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-612-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-613-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-614-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-615-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-616-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,

-617-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-618-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
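
The leap rule just described is mechanical enough to check by machine. Below is a toy checker,
a sketch under simplifying assumptions of mine (intervals counted in semitones, a "step" taken
to be one or two semitones, and the proviso about permissible three-note sonorities ignored):

def leap_violations(melody):
    # melody: a list of MIDI note numbers for a single voice.
    # Rule checked: a leap (more than 2 semitones) must be followed either by a
    # step in the opposite direction or by another leap.
    bad = []
    for i in range(1, len(melody) - 1):
        first = melody[i] - melody[i - 1]
        second = melody[i + 1] - melody[i]
        if abs(first) > 2:                                   # a leap...
            contrary_step = 1 <= abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2
            if not (contrary_step or another_leap):
                bad.append(i)                                # index of the leap's goal note
    return bad

print(leap_violations([60, 65, 64, 62, 67, 69]))  # flags index 4: the leap 62->67 continues by step upward

Likewise, the "interval vector" of the jargon is just a census of interval classes over a
pitch-class set; a short sketch (the set shown is a common listing of Forte's 6-Z44, whose
vector should come out as 313431):

from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs of pitch classes.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))     # -> [3, 1, 3, 4, 3, 1]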

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
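As a sketch of that quantification in twelve-tone equal temperament (the helper name and the
C4 = MIDI 60 naming convention are assumptions of the example, not part of the quoted text):

import math

def frequency_to_note(freq_hz, a4=440.0):
    # MIDI note number: 69 is A4, with 12 semitones per doubling of frequency.
    midi = 69 + 12 * math.log2(freq_hz / a4)
    nearest = round(midi)
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    name = names[nearest % 12] + str(nearest // 12 - 1)
    cents = 100 * (midi - nearest)            # deviation from the tempered pitch
    return name, cents

print(frequency_to_note(261.63))   # ('C4', ~0 cents)
print(frequency_to_note(450.0))    # ('A4', ~+39 cents sharp)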

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-634-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-635-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-636-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
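
The leap rule paraphrased at the top of this passage is mechanical enough to check by program.
Below is a toy Python sketch; the set of permissible leap sizes and the scale-degree encoding
are assumptions for illustration, not the source's definitions, and the three-note sonority
condition on successive leaps is not modeled.

# Melody encoded as scale-degree numbers; a step is a difference of 1,
# anything larger is a leap. Permissible leap sizes (in scale steps:
# thirds through sixths, plus the octave) are assumed, not quoted.
PERMISSIBLE_LEAPS = {2, 3, 4, 5, 7}

def check_leaps(degrees):
    problems = []
    steps = [b - a for a, b in zip(degrees, degrees[1:])]
    for k, iv in enumerate(steps[:-1]):
        if abs(iv) <= 1:
            continue                    # a step, nothing to check
        nxt = steps[k + 1]
        if abs(iv) not in PERMISSIBLE_LEAPS:
            problems.append((k, "leap of impermissible size"))
        elif abs(nxt) == 1 and iv * nxt < 0:
            continue                    # recovered by a step in the opposite direction
        elif abs(nxt) > 1:
            continue                    # another leap: allowed per the rule
        else:
            problems.append((k, "leap not followed by contrary step or leap"))
    return problems

print(check_leaps([1, 5, 4, 3, 2, 1]))   # leap up, steps back down -> []
print(check_leaps([1, 5, 6, 5]))         # leap, then step in the same direction -> flagged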

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related
scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
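
On quantifying pitch as frequency: the "comparison with pure tones" amounts to snapping a
measured frequency onto a reference grid. A small sketch, assuming the conventional 440 Hz
reference and the 12-tone equal-tempered scale:

import math

A4 = 440.0   # reference frequency, a conventional assumption
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz):
    # Nearest equal-tempered note number (MIDI convention: A4 = 69),
    # plus the deviation from that reference tone in cents.
    midi = round(69 + 12 * math.log2(freq_hz / A4))
    ref = A4 * 2 ** ((midi - 69) / 12)
    cents = 1200 * math.log2(freq_hz / ref)
    return f"{NAMES[midi % 12]}{midi // 12 - 1}", round(cents, 1)

print(nearest_pitch(440.0))   # ('A4', 0.0)
print(nearest_pitch(452.0))   # ('A4', 46.6): sharp of A4 by about half a semitone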

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-638-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-639-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

-640-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-641-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-642-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

-643-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-644-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-645-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-646-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-647-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
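The leap rule described above is concrete enough to sketch in code. The toy checker below is my own paraphrase, not the modal rule set itself: pitches are MIDI numbers, a "step" is one or two semitones, and the permissible leaps and "triadic outlines" are simplifying assumptions of mine.

ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}  # m3, M3, P4, P5, m6, octave (semitones)
TRIADIC_OUTLINES = {(3, 4), (4, 3), (3, 5), (5, 3), (4, 5), (5, 4)}  # stacked-third shapes

def leap_rule_ok(melody):
    """Check that every leap is allowed and is resolved either by a step in
    the opposite direction or by a second leap outlining a triadic sonority."""
    for i in range(len(melody) - 1):
        iv = melody[i + 1] - melody[i]
        if abs(iv) <= 2:          # a step needs no resolution
            continue
        if abs(iv) not in ALLOWED_LEAPS:
            return False
        if i + 2 >= len(melody):  # a leap at the very end: treat as unresolved
            return False
        nxt = melody[i + 2] - melody[i + 1]
        contrary_step = abs(nxt) <= 2 and nxt * iv < 0
        outlined = (abs(iv), abs(nxt)) in TRIADIC_OUTLINES and nxt * iv > 0
        if not (contrary_step or outlined):
            return False
    return True

print(leap_rule_ok([60, 65, 64]))          # P4 up, step back down -> True
print(leap_rule_ok([60, 64, 67, 65, 64]))  # M3 + m3 outline a triad, then steps -> True
print(leap_rule_ok([60, 67, 69]))          # leap then same-direction step -> False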

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
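Both of the passage's bogey-terms are mechanical to compute. Below is a short sketch of the interval-class vector: the count of interval classes 1 through 6 over all unordered pairs in a pitch-class set. The set used is 6-Z44 in what I take to be its prime form [0,1,2,5,6,9] (Forte's label; worth checking against his table), whose vector comes out <3,1,3,4,3,1>.

from itertools import combinations

def interval_class_vector(pcs):
    """Count interval classes 1..6 across all unordered pairs of pitch classes."""
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)  # interval class, 1..6 for distinct pcs
        vec[ic - 1] += 1
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # -> [3, 1, 3, 4, 3, 1]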

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-650-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-651-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-652-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-653-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-654-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-655-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in

-656-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can

-657-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-658-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-659-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-660-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from finite set theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′.
NB: A ∩ A′ = ∅ (A and A′ are disjoint).
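A minimal sketch of these definitions in code (the name AGGREGATE is mine, and pitches are written as integers in the MIDI convention purely for illustration):

AGGREGATE = frozenset(range(12))        # U, the set of all twelve pcs (6)

def pcset(pitches):
    # Map a pset (pitches as integers) to its pcset by taking mod 12 (4).
    return frozenset(p % 12 for p in pitches)

A = pcset([60, 64, 67])                 # C4, E4, G4 -> {0, 4, 7}
B = pcset([67, 71, 74])                 # G4, B4, D5 -> {2, 7, 11}

union        = A | B                    # (9)  the union A ∪ B
intersection = A & B                    # (10) the intersection A ∩ B
complement   = AGGREGATE - A            # (12) A′, all of U not in A
assert A & complement == frozenset()    # NB: A ∩ A′ is the null set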

... In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
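The "comparing sounds with pure tones" quantification above reduces, in the equal-tempered convention, to a logarithm: a minimal sketch mapping a frequency to the nearest 12-TET note relative to A4 = 440 Hz (MIDI numbering; the function name is mine):

import math

def freq_to_midi(f_hz):
    # Nearest equal-tempered note: A4 = 440 Hz = MIDI 69, 12 notes per octave.
    return round(69 + 12 * math.log2(f_hz / 440.0))

print(freq_to_midi(261.63))   # middle C -> 60
print(freq_to_midi(442.0))    # a slightly sharp A4 still maps to 69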

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-665-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-666-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-667-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-668-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-669-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-670-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...

-671-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-672-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-673-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.

-674-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-675-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have, however, called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure. For
example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments,
often called lynes, are realized in pitch, time, and other musical dimensions, using some
means of musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic
level, a mode of articulation, or any combination of these, thereby separating it out from ...
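
The leap rule quoted above is mechanical enough to state as code. A minimal sketch, assuming
pitches are given as MIDI note numbers and, as a simplification of the modal rules, treating
any interval of three or more semitones as a leap; the permissible-sonority check on two
successive leaps is deliberately omitted:

    def leap_rule_violations(melody, leap=3):
        """Indices of notes where a leap is not answered by a step back or another leap.

        melody: pitches as MIDI note numbers; any interval of `leap` or more
        semitones counts as a leap (a simplification of the modal rules).
        """
        intervals = [b - a for a, b in zip(melody, melody[1:])]
        bad = []
        for i in range(len(intervals) - 1):
            cur, nxt = intervals[i], intervals[i + 1]
            if abs(cur) >= leap:   # the melody just leapt...
                steps_back = abs(nxt) < leap and (cur > 0) != (nxt > 0)
                leaps_again = abs(nxt) >= leap   # (sonority check omitted)
                if not (steps_back or leaps_again):
                    bad.append(i + 1)            # index of the offending note
        return bad

    # C4 up to F4 (a leap) followed by G4 (a step in the same direction): flagged.
    print(leap_rule_violations([60, 65, 67]))    # -> [1]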

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
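
The quantification described above, assigning a pitch by comparing a sound with pure tones of
known frequency, has a direct computational analogue: estimate the frequency (here by a crude
autocorrelation method) and snap it to the nearest step of the equal-tempered scale. A minimal
sketch, assuming A4 = 440 Hz and MIDI note numbering:

    import numpy as np

    def freq_to_note(f, a4=440.0):
        # Nearest equal-tempered note: MIDI number n satisfies f = a4 * 2**((n - 69) / 12).
        n = int(round(69 + 12 * np.log2(f / a4)))
        names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
        return names[n % 12] + str(n // 12 - 1), n

    def estimate_f0(signal, sr):
        # Crude fundamental-frequency estimate: the period is taken as the lag of
        # the strongest autocorrelation peak after the first dip.
        ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
        rising = np.where(np.diff(ac) > 0)[0]
        start = rising[0] if len(rising) else 1
        lag = start + np.argmax(ac[start:])
        return sr / lag

    sr = 8000
    t = np.arange(sr) / sr
    sig = np.sin(2 * np.pi * 261.63 * t)        # one second of a pure C4 tone
    print(freq_to_note(estimate_f0(sig, sr)))   # -> ('C4', 60)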

-678-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-679-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-680-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-681-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-682-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-683-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...

-684-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-685-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-686-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-687-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-688-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-689-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their
own sake; they enable the listener to parse the ongoing musical fabric into meaningful units.
They help me to determine "by ear" whether the next note is in the same voice, or jumps to
another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts
have sought some extension or generalization of tonal voice-leading for non-tonal music.
Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to do with
tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of compositional
design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch,
time, and other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be associated
with a register, an instrument, a dynamic level, a mode of articulation, or any combination of
these, thereby separating it out from
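The leap rule quoted above is mechanical enough to state as code. Here is a minimal Python
sketch, assuming notes are given as MIDI numbers and simplifying "certain intervals" to the
melodic leaps commonly allowed (3rds, 4ths, 5th, 6ths, octave); the permissible-sonority check
for two successive leaps is omitted.

# Check the counterpoint leap rule: an allowed leap must be followed by a
# step in the opposite direction or by another leap (not validated further here).
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 9, 12}   # semitones: 3rds, 4ths, 5th, 6ths, octave

def leap_violations(melody):
    # Return indices where the rule is broken.
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:                          # a leap...
            if abs(first) not in ALLOWED_LEAPS:
                bad.append(i)                       # forbidden leap size
            elif abs(second) <= 2 and first * second > 0:
                bad.append(i + 1)                   # step continues in the same direction
    return bad

print(leap_violations([60, 65, 64, 62, 60]))  # leap up a 4th, then steps down: []
print(leap_violations([60, 65, 67]))          # leap, then step in same direction: [1]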

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the
uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration among
musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music
uses only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
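The "interval vector" named above is easy to compute: for a pitch-class set, count the interval
class (1 through 6) formed by every unordered pair of pitch classes. A minimal Python sketch;
the set {0, 1, 2, 5, 6, 9} in the example is one common representative of Forte's 6-Z44.

from itertools import combinations

def interval_class_vector(pcs):
    # Count interval classes 1..6 over all unordered pairs of pitch classes.
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # interval class = min(i, 12 - i)
        vec[ic - 1] += 1
    return vec

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # 6-Z44 -> [3, 1, 3, 4, 3, 1]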

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
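As a concrete instance of quantifying pitch against reference pure tones, the standard
twelve-tone equal-temperament mapping from frequency in hertz to MIDI note number (with
A4 = 440 Hz = note 69) looks like this; the helper name is my own.

import math

def frequency_to_midi(freq_hz, a4=440.0):
    # Map a frequency to the nearest equal-tempered MIDI note number.
    return round(69 + 12 * math.log2(freq_hz / a4))

print(frequency_to_midi(440.0))   # 69 (A4)
print(frequency_to_midi(261.63))  # 60 (middle C)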
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)
(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions

-698-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-699-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-700-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...

-701-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-702-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time

-703-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
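
A toy rendering of the leap rule quoted above (entirely my own sketch, not the author's;
pitches are MIDI-style integers, a "step" is taken to be at most two semitones, and the
permissible-sonority condition on consecutive leaps is omitted for brevity):

def leap_rule_ok(melody, max_step=2):
    """True if every leap is followed by a contrary step or by another leap."""
    intervals = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(intervals, intervals[1:]):
        if abs(prev) > max_step:                         # prev is a leap
            steps_back = abs(nxt) <= max_step and nxt * prev < 0
            leaps_on = abs(nxt) > max_step
            if not (steps_back or leaps_on):
                return False
    return True

print(leap_rule_ok([60, 65, 64, 62, 60]))  # True: leap up, then steps back down
print(leap_rule_ok([60, 65, 67]))          # False: a same-direction step follows the leap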

Like Schenkerian analysis in its earlier days, set theory has had an air of the secret society
about it, with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus
often appeared to the uninitiated as the sterile application of arcane, mathematical concepts
to inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
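
A small sketch of the quantification described in the excerpt: mapping a measured frequency to
the nearest note of twelve-tone equal temperament. The A4 = 440 Hz reference and MIDI-style
numbering are standard conventions assumed here, not something the excerpt itself fixes.

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(freq_hz, a4_hz=440.0):
    """Return (note name, octave, cents offset) for a frequency in Hz."""
    midi = 69 + 12 * math.log2(freq_hz / a4_hz)  # 69 is the MIDI number of A4
    nearest = round(midi)
    cents = 100 * (midi - nearest)               # deviation from the nearest note
    return NAMES[nearest % 12], nearest // 12 - 1, cents

print(freq_to_pitch(261.63))  # ('C', 4, ~0): middle C
print(freq_to_pitch(450.0))   # ('A', 4, ~+39): a noticeably sharp A4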

Forte is well known for his book The Structure of Atonal Music.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)
(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

See also
Forte number

References
1. "In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
2. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
3. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
5. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
6. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd ed., ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

Theory Primer

1. Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
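
Definitions (6)-(12) map directly onto Python's built-in set type; a minimal sketch (the names
are mine), with pcs as integers mod 12:

AGGREGATE = frozenset(range(12))  # U, the set of all twelve pcs

def pcs(pitches):
    """Map pitches (MIDI-style integers) to their pitch-classes (mod 12)."""
    return {p % 12 for p in pitches}

a = pcs([60, 64, 67])  # a C major triad in pitch space -> {0, 4, 7}
b = pcs([67, 71, 74])  # a G major triad -> {2, 7, 11}

print(a | b)                           # union, A ∪ B
print(a & b)                           # intersection, A ∩ B -> {7}
print(a.isdisjoint(b))                 # disjoint iff A ∩ B = ∅ -> False
print(AGGREGATE - a)                   # complement A' relative to the aggregate U
print((a & (AGGREGATE - a)) == set())  # A ∩ A' = ∅, as in (12)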

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have,
however, called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

...earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
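
As a small concrete illustration of that quantification (a sketch; the helper name and the
A4 = 440 Hz reference are my assumptions, though the 12-semitones-per-octave logarithmic
mapping is standard for equal temperament):

import math

def nearest_equal_tempered_pitch(freq_hz: float) -> int:
    # Map a frequency in hertz to the nearest MIDI note number,
    # taking A4 = 440 Hz as MIDI 69 and 12 semitones per octave.
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(nearest_equal_tempered_pitch(261.63))  # 60, i.e., middle C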
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"

psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <131 402>

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of undirected intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher-order values. We input these into the
algorithm to produce a second-order segmentation, and so forth, until the music is parsed into
a single segment.

To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension, pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the segmentation is determined by the sequence of
the sizes of successive unordered pitch intervals. The segmental boundaries are shown by
vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F>
in phrase 2 are segmented out of the rest of the measure since they fall in a lower register
from the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two
segments, the second into one. And in general, the boundaries of these first-level segments
never contradict our more intuitively derived phrase structure. The second pass works on the
averages of the values in each level-1 segment. These averages are simply the center pitch of
the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series
of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting
second-level segmentation divides the piece in half in the middle of the third phrase, which
contradicts our six-phrase structure. That the second-pass parsing is at variance with our
phrase structure is not an embarrassment, for we are taking pitch intervals as the only
criterion for segmentation. Let us examine Example 21b with the algorithm taking only time
spans between notes as input. Here the unit of time is a thirty-second note. Once again the
first level basically conforms to our ideas of the phrase structure, with two exceptions.
Likewise, the second pass partitions the stream of durations so that it has an exception
inherited from level 1; the last phrase is divided in half, with its first part serving as a
conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the
algorithm's output using both duration and pitch. The initial values of the previous examples
are simply added together. This time the results get
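
A sketch of the algorithm's simplest form as described above (one dimension, undirected
intervals scanned in threes; the function names, the strict-maximum test, and the use of the
plain mean as the "average" are my assumptions, not the authors' code):

def segment(values):
    # Split a string of dimension values wherever the middle of three
    # successive undirected intervals exceeds both its neighbors; the
    # value after that interval (the "C" of Example 20) starts a new segment.
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    segments, start = [], 0
    for i in range(1, len(intervals) - 1):
        if intervals[i] > intervals[i - 1] and intervals[i] > intervals[i + 1]:
            segments.append(values[start:i + 1])
            start = i + 1
    segments.append(values[start:])
    return segments

def reduce_segments(segments):
    # Average each segment's values to get the next level's higher-order values.
    return [sum(seg) / len(seg) for seg in segments]

pitches = [64, 66, 67, 65, 60, 72, 71, 69, 70]   # hypothetical pitch string
level1 = segment(pitches)                        # first-pass phrases
level2 = segment(reduce_segments(level1))        # second-order segmentation, and so forth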

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

References
1. "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

External links

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
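
In symbols (the standard definition, stated here for reference rather than taken from the
quoted post):

I(X;Y) = D_{\mathrm{KL}}\!\left( p(x,y) \,\middle\|\, p(x)\,p(y) \right)
       = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}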

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.
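
A rough sketch of building one such SSM matrix, the absolute cross correlation CC, for a set
of time series (my construction under stated assumptions, not the paper's code; NumPy assumed):

import numpy as np

def cc_matrix(series: np.ndarray) -> np.ndarray:
    # series has shape (N, T): N oscillators, T samples each.
    # Entry (i, j) is |Pearson r| between series i and series j.
    return np.abs(np.corrcoef(series))

rng = np.random.default_rng(1)
data = rng.standard_normal((5, 1000))   # five hypothetical oscillators
ssm = cc_matrix(data)
edges = ssm > 0.5                       # threshold to define the network's links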

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task compared to the estimation of Covariance.
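The practical point is easy to demonstrate: the sample covariance is a fixed moment of the data, while a histogram MI estimate moves with the density-estimation choices, such as the bin count. A minimal sketch on synthetic data:

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=5000)
y = 0.6 * x + 0.8 * rng.normal(size=5000)

# Covariance: computed directly from sample moments, no densities needed
print(np.mean((x - x.mean()) * (y - y.mean())))

# MI: requires a density estimate first, and the answer shifts with the binning
def mi_hist(x, y, bins):
    p, _, _ = np.histogram2d(x, y, bins=bins)
    p /= p.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

for b in (8, 32, 128):
    print(b, mi_hist(x, y, b))  # drifts upward with finer bins (estimator bias)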

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

-737-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-738-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-739-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-740-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-741-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-742-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand

-743-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar

-744-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-745-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
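
The leap rule quoted above is mechanical enough to state as a predicate. A toy sketch, assuming
the melody is given as MIDI note numbers and treating any interval of three or more semitones
as a leap (the threshold and names are illustrative; the permissible-sonority check is
omitted):

    def follows_leap_rule(melody, leap=3):
        # True if every leap is followed by a step in the opposite
        # direction or by another leap (melody = MIDI note numbers).
        steps = [b - a for a, b in zip(melody, melody[1:])]
        for prev, nxt in zip(steps, steps[1:]):
            if abs(prev) >= leap:                        # a leap...
                step_back = abs(nxt) < leap and nxt * prev < 0
                another_leap = abs(nxt) >= leap
                if not (step_back or another_leap):
                    return False
        return True

    print(follows_leap_rule([60, 64, 62, 60]))  # leap up, step down: True
    print(follows_leap_rule([60, 64, 65, 67]))  # leap up, step up: False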

...arlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
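
For readers stopped by vocabulary like "interval vector": it is only a tally. For a pitch-class
set, count the interval class (the smaller of d and 12 - d) formed by each pair of pitch
classes; the six counts are the interval-class vector. A minimal sketch:

    from itertools import combinations

    def interval_vector(pcs):
        # interval-class vector of a pitch-class set (integers mod 12)
        vec = [0] * 6
        for a, b in combinations(sorted({p % 12 for p in pcs}), 2):
            d = (b - a) % 12
            vec[min(d, 12 - d) - 1] += 1
        return vec

    print(interval_vector([0, 4, 7]))  # C major triad -> [0, 0, 1, 1, 1, 0]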

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
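
The physical half of this passage (oscillations can be measured to obtain a frequency) is easy
to demonstrate. A naive sketch using the first autocorrelation peak of a sampled waveform (the
window length and the 440 Hz test tone are arbitrary choices for illustration):

    import numpy as np

    def estimate_frequency(signal, sample_rate):
        # naive f0 estimate: lag of the first autocorrelation peak
        sig = signal - signal.mean()
        ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]
        rising = np.where(np.diff(ac) > 0)[0]   # skip past the first trough
        start = rising[0] if rising.size else 1
        lag = start + np.argmax(ac[start:])
        return sample_rate / lag

    sr = 44100
    t = np.arange(4410) / sr                    # 0.1 s of signal
    tone = np.sin(2 * np.pi * 440.0 * t)        # a 440 Hz pure tone
    print(estimate_frequency(tone, sr))         # ~441: off by < one lag sample

A pure tone makes the estimate nearly exact; for complex or aperiodic waves, as the passage
notes, assigning a single frequency (and hence a pitch) is itself an approximation.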

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-751-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-752-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-753-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-754-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-755-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-756-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that

-757-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-758-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-759-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
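
A minimal sketch of how one such melodic rule might be checked mechanically (the melody is
given as MIDI note numbers; the set of permitted leap sizes is an illustrative assumption, not
a complete statement of modal counterpoint practice):

    # Flag leaps that are answered neither by a step in the opposite direction
    # nor by another permitted leap.
    ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}  # 3rds, 4th, 5th, minor 6th, octave (in semitones)

    def leap_violations(melody):
        """Return the index of each note reached by a leap left unresolved."""
        violations = []
        for i in range(len(melody) - 2):
            first = melody[i + 1] - melody[i]
            second = melody[i + 2] - melody[i + 1]
            if abs(first) > 2:  # wider than a whole step: treat as a leap
                contrary_step = abs(second) <= 2 and second * first < 0
                another_leap = abs(second) in ALLOWED_LEAPS
                if not (contrary_step or another_leap):
                    violations.append(i + 1)
        return violations

    # The leap D4 -> A4 resolves by descending steps, so nothing is flagged.
    print(leap_violations([62, 69, 67, 65, 64, 62]))  # -> []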

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional
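
For readers put off by the jargon, the "interval vector" at least is easy to compute: it
counts, for each interval class 1 through 6, how many pairs of notes in a pitch-class set lie
that many semitones apart (counting an interval and its mod-12 inversion as the same class). A
minimal sketch (the prime form used here for 6-Z44 follows Forte's catalogue):

    from itertools import combinations

    def interval_class_vector(pcs):
        """Interval-class vector of a pitch-class set: counts of ic 1..6."""
        icv = [0] * 6
        for a, b in combinations(sorted(set(pcs)), 2):
            d = (b - a) % 12
            icv[min(d, 12 - d) - 1] += 1
        return icv

    print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]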

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
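
A minimal sketch of the quantification just described, mapping a frequency in hertz to the
nearest equal-tempered pitch (the A4 = 440 Hz reference and MIDI-style octave numbering are
assumptions of the example):

    import math

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def hz_to_pitch(freq_hz, a4=440.0):
        """Name the equal-tempered pitch nearest to a given frequency."""
        midi = round(69 + 12 * math.log2(freq_hz / a4))    # 69 = MIDI number of A4
        return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"  # MIDI 60 -> C4

    print(hz_to_pitch(261.63))  # -> C4 (middle C)
    print(hz_to_pitch(466.16))  # -> A#4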

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-775-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
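The leap rule stated at the opening of this passage can be put procedurally. The sketch below
is hypothetical: the whitelist of permissible leap intervals is an illustrative assumption, and
the three-note-sonority condition on two successive leaps is omitted.

# Melodic intervals in semitones; positive = up, negative = down.
PERMISSIBLE_LEAPS = {3, 4, 5, 7, 8, 12}   # assumed whitelist (m3..P5, m6, P8)

def leap_rule_ok(melody):
    """Check: every leap is whitelisted and is followed either by a step in
    the opposite direction or by another whitelisted leap.
    (The sonority condition on successive leaps is not modeled here.)"""
    steps = [b - a for a, b in zip(melody, melody[1:])]
    for i, iv in enumerate(steps):
        if abs(iv) <= 2:                   # a step, not a leap
            continue
        if abs(iv) not in PERMISSIBLE_LEAPS:
            return False
        if i + 1 == len(steps):            # leap ends the line: let it pass
            continue
        nxt = steps[i + 1]
        step_back = abs(nxt) <= 2 and nxt * iv < 0
        another_leap = abs(nxt) in PERMISSIBLE_LEAPS
        if not (step_back or another_leap):
            return False
    return True

print(leap_rule_ok([60, 65, 64, 62]))      # True: leap up a 4th, step back down
print(leap_rule_ok([60, 65, 67]))          # False: leap, then same-direction step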

From its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
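Expressions like "interval vector" are less forbidding in code than in prose. A minimal sketch
(mine, not from the passage): the interval vector of a pc set counts the interval classes 1
through 6 over all pairs of its pcs; 6-Z44, mentioned above, serves as the example.

from itertools import combinations

def interval_vector(pcset):
    """Count interval classes 1..6 over all unordered pairs of pcs."""
    vec = [0] * 6
    for a, b in combinations(pcset, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector({0, 1, 2, 5, 6, 9}))     # 6-Z44: [3, 1, 3, 4, 3, 1]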

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
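A short sketch of definitions (1) through (12) in Python, with built-in sets standing in for
pcsets (the variable names are mine, not the primer's):

AGGREGATE = set(range(12))                 # U: all twelve pitch classes

def to_pcset(pitches):
    """Map a pset (pitches as integers, middle C = 60) to its pcset, mod 12."""
    return {p % 12 for p in pitches}

pset = [60, 64, 67, 76]                    # C4 E4 G4 E5: E5 and E4 share a pc
A = to_pcset(pset)                         # {0, 4, 7}
B = {0, 2, 7, 11}

print(A | B)                               # union
print(A & B)                               # intersection: {0, 7}
print(A & B == set())                      # disjoint? False
print(AGGREGATE - A)                       # complement A' relative to U
print(A & (AGGREGATE - A))                 # A n A' is always the null set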

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible

-784-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-785-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-786-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-787-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-788-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
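
The "interval vector" mentioned above is easy to make concrete. A minimal sketch in plain Python (illustrative only): it counts the six interval classes over all pairs of a pitch-class set; for Forte's 6-Z44, whose prime form is {0,1,2,5,6,9}, it returns <3,1,3,4,3,1>.

# Interval-class vector of a pitch-class set: count ics 1..6 over all pairs.
from itertools import combinations

def interval_class_vector(pcset):
    vec = [0] * 6
    for a, b in combinations(sorted(pcset), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # fold intervals to classes 1..6
        vec[ic - 1] += 1
    return vec

print(interval_class_vector({0, 1, 2, 5, 6, 9}))   # -> [3, 1, 3, 4, 3, 1]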

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
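
The periodicity that temporal theories appeal to also underlies the simplest pitch estimators. A minimal sketch (assuming NumPy; a toy autocorrelation method, not a model of the auditory system) that recovers the fundamental of a complex periodic tone:

# Temporal-coding flavor in miniature: estimate a pitch as the lag at which the
# waveform best matches a shifted copy of itself (autocorrelation peak).
import numpy as np

sr = 44100                                   # sample rate, Hz
t = np.arange(0, 0.05, 1 / sr)
f0 = 220.0                                   # "true" pitch to recover
x = np.sin(2 * np.pi * f0 * t) + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)

ac = np.correlate(x, x, mode='full')[x.size - 1:]   # lags 0..N-1
min_lag = int(sr / 1000.0)                   # ignore lags above 1000 Hz
lag = min_lag + np.argmax(ac[min_lag:])      # strongest periodicity peak
print(f"estimated pitch: {sr / lag:.1f} Hz") # ~220 Hz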

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-792-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-793-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and

-794-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-795-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-796-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work

-797-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a

-798-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-799-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-800-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-801-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-802-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however,
called such work into question.[2] Other theorists have obviated voice-leading as a criterion
for distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
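
As a toy illustration of the leap rule just described, the sketch below scans a melody (given
as MIDI note numbers, an assumption of this example) for leaps that are neither followed by a
step in the opposite direction nor by another leap. The two-semitone step threshold and the
omission of the permissible-sonority test are simplifications, not part of the passage.

def unresolved_leaps(melody, max_step=2):
    # Return indices of leap endpoints that the rule would flag.
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > max_step:                              # a leap...
            steps_back = 0 < abs(second) <= max_step and second * first < 0
            leaps_on = abs(second) > max_step                  # sonority test omitted
            if not (steps_back or leaps_on):
                violations.append(i + 1)
    return violations

print(unresolved_leaps([60, 65, 64, 62, 60]))  # []: the leap 60->65 resolves by step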

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional
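
Of the vocabulary items just mentioned, the interval vector is purely mechanical to compute:
for every pair of pitch classes in a set, reduce the interval to an interval class (1 through
6) and tally. A minimal sketch:

from itertools import combinations

def interval_vector(pcs):
    # Interval-class vector of a pitch-class set (pitch classes 0-11).
    vector = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        vector[min(d, 12 - d) - 1] += 1
    return vector

# 6-Z44 in prime form is (0,1,2,5,6,9); its interval vector is 313431.
print(interval_vector([0, 1, 2, 5, 6, 9]))  # [3, 1, 3, 4, 3, 1]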

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale

...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
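
The comparison procedure described here amounts to a logarithmic mapping from frequency to the
nearest note of the scale. A minimal sketch, assuming the conventional equal-tempered
A4 = 440 Hz reference (not stated in the excerpt):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz, a4=440.0):
    # Snap a frequency to the nearest equal-tempered note; return name, octave, cents.
    midi = 69 + 12 * math.log2(freq_hz / a4)   # fractional MIDI note number
    nearest = round(midi)
    cents = 100 * (midi - nearest)             # signed deviation from that note
    return NOTE_NAMES[nearest % 12], nearest // 12 - 1, cents

print(nearest_note(442.3))  # ('A', 4, ~+9 cents)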
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

Personal life[edit]
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)[edit]


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from
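The leap rule at the start of the excerpt above is mechanical enough to express as code. The following is a minimal sketch in Python; since the passage does not enumerate the "few permissible three-note sonorities," the interval sets used here (consonant leaps, triadic outlines) are simplifying assumptions for illustration, not rules taken from the text.

# Hypothetical check of the cantus-firmus leap rule quoted above: a leap must
# be followed by a step in the opposite direction, or by another leap whose
# combined outline forms a permissible three-note sonority.
# The interval sets below are assumed for illustration, not drawn from the text.

STEP = {1, 2}                              # semitone counts treated as "steps"
PERMISSIBLE_LEAPS = {3, 4, 5, 7, 8, 12}    # common consonant leaps (assumed)
TRIAD_OUTLINES = {(3, 4), (4, 3), (3, 5), (5, 3), (4, 5), (5, 4)}  # assumed

def leap_rule_ok(a, b, c):
    """Check three successive notes a, b, c (MIDI numbers) against the rule."""
    first, second = b - a, c - b
    if abs(first) in STEP:
        return True                        # no leap, nothing to check
    if abs(first) not in PERMISSIBLE_LEAPS:
        return False                       # forbidden leap interval
    if abs(second) in STEP and (first > 0) != (second > 0):
        return True                        # step back in the opposite direction
    # two successive leaps: require a permissible three-note outline
    return (abs(first), abs(second)) in TRIAD_OUTLINES

print(leap_rule_ok(60, 65, 64))  # leap of a fourth, step back down -> True
print(leap_rule_ok(60, 67, 74))  # two leaps of a fifth -> False (no outline)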

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
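For the uninitiated reader, the "interval vector" mentioned above is easy to compute: for every pair of pitch classes in a set, count which of the six interval classes the pair forms. A minimal sketch in Python follows; the pc set used is the prime form commonly given for Forte's 6-Z44, taken here as an assumption to be checked against a set-class table.

# Compute the interval-class vector of a pitch-class set: for every pair of
# pcs, take the interval mod 12, fold it into an interval class (1..6), and
# tally the counts. Labels like "6-Z44" name set classes in Forte's catalogue.
from itertools import combinations

def interval_class_vector(pcs):
    vector = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = (b - a) % 12
        ic = min(ic, 12 - ic)          # fold intervals 7..11 down to 5..1
        vector[ic - 1] += 1
    return vector

hexachord_6z44 = [0, 1, 2, 5, 6, 9]    # assumed prime form of 6-Z44
print(interval_class_vector(hexachord_6z44))   # -> [3, 1, 3, 4, 3, 1]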

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch is closely related to frequency, but the two are not equivalent. Frequency is an objective, scientific attribute that can be measured. Pitch is each person's subjective perception of a sound wave, which cannot be directly measured. However, this does not necessarily mean that most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of the auditory system work together to yield the experience of pitch. In general, pitch perception theories can be divided into place coding and temporal coding. Place theory holds that the perception of pitch is determined by the place of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for the perception of high frequencies, since neurons have an upper limit on how fast they can phase-lock their action potentials.[6] However, a purely place-based theory cannot account for the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
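The quantification described above (assigning a pitch by comparison with pure tones of known frequency) reduces, for equal temperament, to a logarithm. A minimal sketch in Python, assuming the conventional A4 = 440 Hz reference and MIDI note numbering:

# Map a measured frequency to the nearest equal-tempered pitch by comparison
# against the A4 = 440 Hz reference: each semitone is a factor of 2**(1/12).
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz, ref_a4=440.0):
    midi = round(69 + 12 * math.log2(freq_hz / ref_a4))  # 69 = MIDI number of A4
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)   # MIDI octave convention
    cents_off = 1200 * math.log2(freq_hz / (ref_a4 * 2 ** ((midi - 69) / 12)))
    return name, round(cents_off, 1)

print(nearest_pitch(440.0))   # ('A4', 0.0)
print(nearest_pitch(261.63))  # ('C4', ~0.0): middle C
print(nearest_pitch(452.0))   # ('A4', 46.6): sharp of A4, not yet A#4

Twelve-tone equal temperament spaces semitones by a factor of 2^(1/12), so 12 * log2(f/440) counts semitones from A4; the residual in cents shows how far the measured frequency sits from the nearest note.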


https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-832-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-833-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-834-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-835-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-836-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
Forte is well known for his book The Str
Personal life[edit]
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

-837-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Bibliography (Books only)[edit]


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also[edit]
Forte number
References[edit]
Jump up ^ "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
Jump up ^ http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
Jump up ^ Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter,"
Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 8.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
Jump up ^ Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2
(1964): 136-183.
Jump up ^ David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove
Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford
University Press, 2013), 8:175-176.
External links[edit]

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance
ry Primer
1
Pitch and pitch-class (pc)(1)
Pitch space
: a linear series of pitches (semitones) from low to high modeled by integers.(2) Sets of
pitches (called
psets
) are selections from the set of pitches; they are unordered in time.(3)
Pc space
: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see
below).(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any
number of octaves map to the same pitch-class.(5) Sets of pcs (called
pcsets
) are selections from the set of pcs; they are unordered in time (and pitch).(6) Pcsets must be
realized (or represented or articulated) by pitches. To realize a pcset in music, it must
be ordered in pitch andin time. Every musical articulation of a pcset produces a contour. Many
different psets may represent one pcset. Pcsets may modelmelodies, harmonies, mixed textures,
etc.Definitions from Finite Set Theory(6) The set of all the pcs is called the
aggregate

-838-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
and is denoted by the letter U; the set of no pcs is called the empty or
null
set, and isdenoted by the sign

(7) Membership: If a is a member (or element) of the set B, we write a


?
B.(8) Inclusion: If A and B are sets and A is contained in B, we write A
?
B.(9) The union of two sets A and B (written A
?
B) is the content of both of them.(10) The intersection of two sets A and B is their common
elements (written A
n
B).(11) Two sets are disjoint if their intersection is

.(12) B is the complement of A, if B contains all elements of U not in A. We show the


complement of A by A
'
.NB: A
n
A
'
=

(A and A
'
are d

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in

-839-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can

-840-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

httpsructure of Atonal Music (1973), which traces many of its roots to an article of a decade
earlier: "A Theory of Set-Complexes for Music" (1964).[6] In these works, he "applied
set-theoretic principles to the analysis of unordered collections of pitch classes, called
pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to define the various
relationships that existed among the relevant sets of a work, so that contextual coherence
could be demonstrated." Although the methodology derived from Fortes work "has had its
detractors ... textbooks on post-tonal analysis now routinely teach it (to varying degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six other types publications, from 1955 through early 2009

Forte was also the editor of the Journal of Music Theory during an important period in its
development, from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal,
including many biographical details, is addressed in David Carson Berry, "Journal of Music
Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards[edit]


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.
specified average which assigns a relative "weight" to each of the dimensions. Example 20
illustrates the principle of the Tenney/ Polansky algorithm: For four successive dimension
values labeled A through D forming three successive unordered intervals labeled X, Y, and Z, if
the middle interval is greater than the other two intervals, the string of values is segmented
in half; the value C starts a new segment or phrase. In its simplest form, using only one
musical dimension, the algorithm works by going through the dimension's list of un- directed
intervals in threes looking for maximum values and segmenting accordingly. This results in a
series of successive segments (or phrases). We can then average the values in each of the
output segments to get a series of new higher- order values. We input these into the algorithm
to produce a second-order segmentation and so forth, until the music is parsed into a single
segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece.
Example 21a shows the results using one dimension -pitch alone. The first pass segments the
pitches into segments of three to six pitches; that is, the seg- mentation is determined by the
sequence of the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In
essence, the algorithm looks at a string of intervals derived from the successivead the 00
Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

-841-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversi
on-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools

-842-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

lected for analysis.


Then the algorithm combines the values of each dimension's suc- cessive intervals according to
a user-specified average which assigns a relative "weight" to each of the dimensions. Example
20 illustrates the principle of the Tenney/ Polansky algorithm: For four successive dimension
values labeled A through D forming three successive unordered intervals labeled X, Y, and Z, if
the middle interval is greater than the other two intervals, the string of values is segmented
in half; the value C starts a new segment or phrase. In its simplest form, using only one
musical dimension, the algorithm works by going through the dimension's list of un- directed
intervals in threes looking for maximum values and segmenting accordingly. This results in a
series of successive segments (or phrases). We can then average the values in each of the
output segments to get a series of new higher- order values. We input these into the algorithm
to produce a second-order segmentation and so forth, until the music is parsed into a single
segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the Schoenberg piece.
Example 21a shows the results using one dimension -pitch alone. The first pass segments the
pitches into segments of three to six pitches; that is, the seg- mentation is determined by the
sequence of the sizes of suc- cessive unordered pitch intervals. The segmental boundaries In
essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's suc- cessive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/ Polansky algorithm: For four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of un- directed intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher- order values. We input these into
the algorithm to produce a second-order segmentation and so forth, until the music is parsed
into a single segment. To illustrate the Tenney/Polansky Algorithm, we perform it on the
Schoenberg piece. Example 21a shows the results using one dimension -pitch alone. The first
pass segments the pitches into segments of three to six pitches; that is, the seg- mentation is
determined by the sequence of the sizes of suc- cessive unordered pitch intervals. The
segmental boundaries In essence, the algorithm looks at a string of intervals derived from the
successive values in some musical dimension in a piece of music. The string might be a series
of pitch intervals, time intervals (dEIn essence, the algorithm looks at a string of intervals
derived from the successive values in some musical dimension in a piece of music. The string
might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth.
More than one string can be selected for analysis. Then the algorithm combines the values of
each dimension's suc- cessive intervals according to a user-specified average which assigns a
relative "weight" to each of the dimensions. Example 20 illustrates the principle of the
Tenney/ Polansky algorithm: For four successive dimension values labeled A through D forming
three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater
than the other two intervals, the string of values is segmented in half; the value C starts a
new segment or phrase. In its simplest form, using only one musical dimension, the algorithm
works by going through the dimension's list of un- directed intervals in threes looking for
maximum values and segmenting accordingly. This results in a series of successive segments (or
phrases). We can then xample 18. Primes enclosed in rectangles Example 18. Primes enclosed in
rectangles Example 18. Primes enclosed in rectangles Example 18. Primes enclosed in rectangles
Example 18. Primes enclosed in rectangles Example 18. Primes enclosed in rectangles Example 18.

-843-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Primes enclosed in rectangles Example 18. Primes enclosed in rectangles Example 18. Primes
enclosed in rectangles Example 18. Primes enclosed in rectangles Example 18. Primes enclosed in
rectangles
<0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <0 2 1> <1 0 3
2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <1 0 3 2 <
131 402> < 131 402> < 131 402> < 131 402> < 131 402> < 131 402> < 131 402> < 131 402> < 131
402> < 131 402> < 131 402>

read the 00 Kernel Debug Guide


https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversi
on-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools

-844-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's suc- cessive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/ Polansky algorithm: For four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
Example 21a shows the results using one dimension, pitch alone. The first pass segments the
pitches into segments of three to six pitches; that is, the segmentation is determined by the
sequence of the sizes of successive unordered pitch intervals. The segmental boundaries
are shown by vertical lines. The results are quite reasonable. For instance, the four pitches
<E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a
lower register from the others. Phrases 4 and 5 seem segmented correctly; the first is divided
into two segments, the second into one. And in general, the boundaries of these first-level
segments never contradict our more intuitively derived phrase structure. The second pass works
on the averages of the values in each level-1 segment. These averages are simply the center
pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between
the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The
resulting second-level segmentation divides the piece in half in the middle of the third
phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance
with our phrase structure is not an embarrassment, for we are taking pitch intervals as the
only criterion for segmentation. Let us examine Example 21b with the algorithm taking only time
spans between notes as input. Here the unit of time is a thirty-second note. Once again the
first level basically conforms to our ideas of the phrase structure, with two exceptions.
Likewise, the second pass partitions the stream of durations so that it has an exception
inherited from level 1; the last phrase is divided in half, with its first part serving as a
conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the
algorithm's output using both duration and pitch. The initial values of the previous examples
are simply added together. This time the results get

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.

(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also: Forte number

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
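
The asymmetry the answer describes can be seen in a few lines of NumPy; the quadratic example,
the 20-bin histogram, and all names below are illustrative choices, not part of the quoted
answer:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 10_000)
    y = x ** 2  # dependent, but not linearly

    # Correlation comes straight from sample moments.
    print("Pearson r:", np.corrcoef(x, y)[0, 1])  # close to 0

    # MI needs the distributions, here crudely estimated by a 2-D histogram.
    pxy, _, _ = np.histogram2d(x, y, bins=20)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    print("MI estimate (nats):", mi)  # clearly positive

Note that the MI figure depends on the binning: the estimate, unlike the sample covariance, is
only as good as the estimated distribution.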
Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A'.
NB: A ∩ A' = ∅ (A and A' are disjoint).
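
A sketch of definitions (1) through (12) above in Python, modeling pcs as integers mod 12 and
pcsets as frozensets; AGGREGATE, EMPTY, and pcset are my own names for these illustrations:

    AGGREGATE = frozenset(range(12))  # U: all twelve pcs
    EMPTY = frozenset()               # the null set

    def pc(pitch):
        # Map a pitch (integer semitones) to its pitch-class, mod 12.
        return pitch % 12

    def pcset(pitches):
        # Realize an unordered pcset from any collection of pitches;
        # octave-related pitches collapse onto the same pc.
        return frozenset(pc(p) for p in pitches)

    a = pcset([60, 64, 67])   # C major triad -> {0, 4, 7}
    b = pcset([67, 71, 74])   # G major triad -> {7, 11, 2}

    print(a | b)              # union
    print(a & b)              # intersection: {7}
    print(a & b == EMPTY)     # disjoint? False
    print(AGGREGATE - a)      # complement A' relative to U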


https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
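
To make the quoted distinction concrete, a quick check with SciPy; the exponential transform is
simply one arbitrary monotonic, non-linear choice:

    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(1)
    x = rng.normal(size=5_000)
    y = np.exp(x)  # monotonic but non-linear function of x

    print("Pearson r:   ", pearsonr(x, y)[0])   # well below 1
    print("Spearman rho:", spearmanr(x, y)[0])  # 1.0: the ranks match exactly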

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
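
The excerpt's point that pitches are quantified by comparison with tones of known frequency can
be sketched numerically, assuming twelve-tone equal temperament with A4 = 440 Hz (both are
assumptions of the sketch, not claims of the article):

    import math

    A4 = 440.0  # assumed tuning reference

    def freq_to_pitch(freq_hz):
        # Distance from A4 in equal-tempered semitones; 69 is MIDI A4.
        semis = 12 * math.log2(freq_hz / A4) + 69
        note = round(semis)
        cents = 100 * (semis - note)  # residual detuning from the nearest note
        return note, cents

    print(freq_to_pitch(261.63))  # ~(60, ~0.0): middle C
    print(freq_to_pitch(450.0))   # A4 plus roughly 39 cents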



https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
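
Of the three SSMs, the ordinal-pattern variant is the least standard; a compact plug-in version
might look like the following sketch, where the window length m = 3, the test signals, and all
function names are arbitrary choices of mine rather than the paper's code:

    import numpy as np
    from collections import Counter

    def ordinal_symbols(x, m=3):
        # Map each length-m window to its ordinal (permutation) pattern.
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def mi_ordinal(x, y, m=3):
        # Plug-in MI estimate between the two symbol streams.
        sx, sy = ordinal_symbols(x, m), ordinal_symbols(y, m)
        n = len(sx)
        pxy = Counter(zip(sx, sy))
        px, py = Counter(sx), Counter(sy)
        return sum((c / n) * np.log((c / n) / ((px[a] / n) * (py[b] / n)))
                   for (a, b), c in pxy.items())

    t = np.linspace(0, 20, 2000)
    x = np.sin(t)
    y = np.sin(t + 0.5)  # phase-shifted copy: strongly dependent
    print(mi_ordinal(x, y))  # clearly positive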

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
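
The leap rule in the paragraph above is mechanical enough to check by machine. A toy version in
semitones (the three-semitone leap threshold is an assumption, and the proviso about
permissible three-note sonorities is omitted):

    LEAP = 3  # intervals of 3+ semitones count as leaps here (an assumption)

    def violations(melody):
        # Return indices of notes that continue a leap with neither a step in
        # the opposite direction nor another leap.
        bad = []
        moves = [b - a for a, b in zip(melody, melody[1:])]
        for i in range(len(moves) - 1):
            cur, nxt = moves[i], moves[i + 1]
            if abs(cur) >= LEAP:
                steps_back = 0 < abs(nxt) < LEAP and (cur > 0) != (nxt > 0)
                leaps_again = abs(nxt) >= LEAP
                if not (steps_back or leaps_again):
                    bad.append(i + 1)  # note i+1 begins the offending move
        return bad

    print(violations([60, 65, 64, 62, 67, 69]))  # [4]: a step continues upward
                                                 # in the same direction as the leap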

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions

-857-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be

-858-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
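
The leap rule paraphrased above is easy to state operationally. The sketch below is a deliberately crude toy: the two-semitone step/leap threshold and the omission of the permissible-sonority check are my simplifications, not a faithful encoding of any counterpoint treatise.

def leap_rule_violations(midi_pitches, step_max=2):
    # Flag a note that follows a leap neither by a step in the opposite
    # direction nor by another leap.
    violations = []
    for i in range(len(midi_pitches) - 2):
        first = midi_pitches[i + 1] - midi_pitches[i]
        second = midi_pitches[i + 2] - midi_pitches[i + 1]
        if abs(first) > step_max:                                 # a leap...
            steps_back = abs(second) <= step_max and first * second < 0
            leaps_on = abs(second) > step_max
            if not (steps_back or leaps_on):
                violations.append(i + 1)
    return violations

print(leap_rule_violations([62, 64, 67, 65, 64]))  # [] - leap resolved by step down
print(leap_rule_violations([60, 67, 69]))          # [1] - leap followed by step up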

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
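
For readers put off by tokens like "interval vector," the object itself is small: it counts, for each interval class 1 through 6, how many pairs of pitch classes in a set lie that far apart (mod 12, folded so that, e.g., 9 semitones counts as interval class 3). A short sketch:

from itertools import combinations

def interval_class_vector(pcs):
    # Interval-class vector of a pitch-class set: counts of ics 1..6.
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        interval = (b - a) % 12
        ic = min(interval, 12 - interval)   # fold into interval classes 1..6
        vector[ic - 1] += 1
    return vector

# 6-Z44, prime form [0,1,2,5,6,9], has the interval vector <313431>.
print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]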

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
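
The quantification described here (matching a sound against pure tones and reporting a frequency) can be mimicked by snapping a frequency to the nearest note of the 12-tone equal-tempered scale. The A4 = 440 Hz reference in this sketch is an assumption of the illustration, not something fixed by the passage.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz, a4=440.0):
    # Quantify a frequency as the nearest 12-tone equal-tempered pitch,
    # reporting the residual deviation in cents.
    midi = round(69 + 12 * math.log2(freq_hz / a4))
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
    exact = a4 * 2 ** ((midi - 69) / 12)
    cents = 1200 * math.log2(freq_hz / exact)
    return name, round(cents, 1)

print(nearest_note(440.0))   # ('A4', 0.0)
print(nearest_note(452.0))   # ('A4', ~46.6 cents sharp)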
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds

-864-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-865-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-866-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-867-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-868-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-869-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand

-870-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-871-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-872-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
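
The leap rule quoted above is mechanical enough to check by machine. A toy sketch (my
simplification, not the author's: a "step" is taken as at most 2 semitones, and the
permissible-three-note-sonority escape clause for double leaps is omitted):

# Flag leaps that are not answered by a step in the opposite direction.
def leap_rule_violations(melody):
    """melody: list of pitches in semitones (e.g. MIDI note numbers)."""
    bad = []
    for i in range(len(melody) - 2):
        a, b, c = melody[i:i + 3]
        leap, answer = b - a, c - b
        # A leap (> 2 semitones) should be followed by a step
        # (<= 2 semitones) in the opposite direction.
        if abs(leap) > 2 and not (abs(answer) <= 2 and leap * answer < 0):
            bad.append(i)
    return bad

print(leap_rule_violations([60, 65, 64, 62]))  # leap up, step down -> []
print(leap_rule_violations([60, 65, 67, 69]))  # leap, then more ascent -> [0]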

Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
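
For all its air of secrecy, the vocabulary is computationally plain. A short sketch of the
interval-class vector mentioned above (pitch classes as integers mod 12; the reading of 6-Z44
as {0,1,2,5,6,9} follows Forte's tables and is worth double-checking there):

from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector of a pitch-class set: counts of interval
    classes 1..6 over all unordered pairs of distinct pitch classes."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]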

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
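
As a concrete instance of quantifying pitch by comparison with a reference: assuming twelve-tone
equal temperament tuned to A4 = 440 Hz (the usual convention, not something the excerpt itself
specifies), a frequency maps to the nearest note as follows.

import math

A4 = 440.0  # reference tuning in Hz (an assumption; concert pitch varies)

def freq_to_nearest_note(f_hz):
    """Nearest equal-tempered MIDI note number for a frequency, plus the
    deviation from it in cents."""
    n = 69 + 12 * math.log2(f_hz / A4)   # 69 is the MIDI number of A4
    nearest = round(n)
    cents = 100.0 * (n - nearest)
    return nearest, cents

print(freq_to_nearest_note(440.0))    # (69, 0.0) -> A4
print(freq_to_nearest_note(261.63))   # (60, ~0)  -> middle C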

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-878-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-879-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds

-880-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-881-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-882-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl

-883-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-884-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-885-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-886-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
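
The quantification described above (matching a sound against pure tones of known frequency)
reduces, for equal-tempered music, to a logarithmic map from frequency to pitch number. A small
sketch, assuming A4 = 440 Hz and MIDI numbering (A4 = 69):

import math

def freq_to_midi(f_hz, a4=440.0):
    # Nearest equal-tempered MIDI note number for a frequency.
    return round(69 + 12 * math.log2(f_hz / a4))

def midi_to_freq(n, a4=440.0):
    # Inverse map: frequency of MIDI note n.
    return a4 * 2 ** ((n - 69) / 12)

print(freq_to_midi(261.63))   # 60, middle C
print(midi_to_freq(69))       # 440.0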

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-891-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-892-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-893-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-894-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-895-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir

-896-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

-897-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-898-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-899-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-900-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
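
These definitions map directly onto finite-set operations in code. A minimal sketch using Python's
built-in sets (the two triads and the name AGGREGATE are my own illustration, not the primer's):

# Pitch-class set operations from the primer, modeled with Python sets.
AGGREGATE = frozenset(range(12))  # U: all twelve pcs, integers mod 12

def pcs(*pitches):
    # (4) pitches map to pitch-classes by reduction mod 12
    return frozenset(p % 12 for p in pitches)

A = pcs(60, 64, 67)           # C major triad as pitches -> pcset {0, 4, 7}
B = pcs(67, 71, 74)           # G major triad -> {2, 7, 11}

print(A | B)                  # (9) union: content of both sets
print(A & B)                  # (10) intersection: common elements, here {7}
print(A & B == set())         # (11) disjoint? False: they share pc 7
complement_A = AGGREGATE - A  # (12) complement A': everything in U not in A
print(sorted(complement_A))
print(A & complement_A)       # NB: A n A' is the empty set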

In single-voice writing there are "rules" for the way a melody should progress. In the composition
of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and
must be followed either by a step in the opposite direction or by another leap, provided the two
successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts,
the leading of a voice is determined even further. As I compose, for instance, I ask: Will the
next note I write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers
to such questions will of course depend on whether the note is in the bass, soprano, or an inner
voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the
listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by
ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth. Many composers and analysts have sought some extension or
generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy
Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation
to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N.
Straus and others have however called such work into question.2 Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in
my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called
lynes, are realized in pitch, time, and other musical dimensions, using some means of musical
articulation to maintain an association between the components of a given lyne.3 For instance, a
lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation,
or any combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission granted
only to those who possess the magic password, a forbidding technical vocabulary bristling with
expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as
the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language. Where did this
theory come from and how has it managed to become so dominant? Set theory emerged in response to
the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of
referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary
variety of musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes
and modes of progression. In tonal music, motivic relationships are constrained by the norms of
tonal syntax; in post-tonal music, motives become independent and function as primary structural
determinants. In this situation, a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
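
The quantification step mentioned above (a pitch labeled by comparison with pure tones of known
frequency) is conventionally formalized by the equal-tempered mapping between frequency and note
number. A small sketch, assuming standard 12-tone equal temperament with A4 = 440 Hz (the
reference choice is mine, not the article's):

import math

def freq_to_midi(f_hz, a4=440.0):
    # Nearest equal-tempered MIDI note number: 12 semitones per doubling
    # of frequency, with A4 = MIDI note 69 as the anchor.
    return round(69 + 12 * math.log2(f_hz / a4))

def midi_to_freq(note, a4=440.0):
    # Inverse mapping: the periodic frequency a pitch is quantified as.
    return a4 * 2 ** ((note - 69) / 12)

print(freq_to_midi(261.63))  # ~middle C -> 60
print(midi_to_freq(69))      # 440.0
print(freq_to_midi(262.0))   # still 60: nearby frequencies get the same pitch label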

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-905-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-906-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-907-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-908-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-909-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-910-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)

-911-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-912-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-913-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by

-914-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the composition
of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and
must be followed either by a step in the opposite direction or by another leap, provided the two
successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts,
the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the dissonance correctly
prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such
questions will of course depend on whether the note is in the bass, soprano, or an inner voice.)
But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to
parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or
not, and so forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have, however, called such work into question.2 Other theorists have obviated voice-leading
as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation to
maintain an association between the components of a given lyne.3 For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
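
A toy sketch of the melodic rule just described (a leap must be followed by a step back or by
another leap); the interval sets are illustrative assumptions, and the "permissible three-note
sonority" condition on double leaps is omitted:

    PERMISSIBLE_LEAPS = {3, 4, 5, 7, 8, 12}   # semitones: m3, M3, P4, P5, m6, octave
    STEPS = {1, 2}

    def leap_rule_ok(melody):
        """melody: MIDI note numbers; True if every leap is followed by a step
        in the opposite direction or by another permissible leap."""
        for i in range(len(melody) - 2):
            move = melody[i + 1] - melody[i]
            nxt = melody[i + 2] - melody[i + 1]
            if abs(move) in PERMISSIBLE_LEAPS:
                steps_back = abs(nxt) in STEPS and move * nxt < 0
                if not (steps_back or abs(nxt) in PERMISSIBLE_LEAPS):
                    return False
        return True

    print(leap_rule_ok([60, 65, 64, 62]))   # leap up a fourth, then step down: True
    print(leap_rule_ok([60, 65, 67, 69]))   # leap up, then step in the same direction: False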

Since its earliest days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary bristling
with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated
as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language. Where did this
theory come from and how has it managed to become so dominant? Set theory emerged in response to
the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of
referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary
variety of musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes
and modes of progression. In tonal music, motivic relationships are constrained by the norms of
tonal syntax; in post-tonal music, motives become independent and function as primary structural
determinants. In this situation, a new music theory was needed, free of traditional
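
Both items of that "forbidding vocabulary" are mechanical to compute. A short sketch (the prime
form [0,1,2,5,6,9] for Forte's 6-Z44 is standard; the function name is mine):

    from itertools import combinations

    def interval_vector(pcs):
        """Interval-class vector of a pitch-class set: counts of interval
        classes 1..6 over all unordered pairs of (mod-12) pitch classes."""
        vec = [0] * 6
        for a, b in combinations(sorted(set(p % 12 for p in pcs)), 2):
            ic = min((b - a) % 12, (a - b) % 12)   # interval class, 1..6
            vec[ic - 1] += 1
        return vec

    # Forte's 6-Z44, prime form [0,1,2,5,6,9]:
    print(interval_vector([0, 1, 2, 5, 6, 9]))     # -> [3, 1, 3, 4, 3, 1]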

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
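
The "comparing sounds with pure tones" convention has a standard quantitative form: in
twelve-tone equal temperament with A4 = 440 Hz, a frequency maps to a note number via
n = 69 + 12 * log2(f / 440) (the MIDI convention). A sketch:

    import math

    def freq_to_midi(f_hz, ref=440.0):
        # nearest 12-TET note number, using the A4 = 440 Hz = MIDI 69 convention
        return round(69 + 12 * math.log2(f_hz / ref))

    def midi_to_freq(n, ref=440.0):
        return ref * 2 ** ((n - 69) / 12)

    print(freq_to_midi(261.63))   # 60, middle C
    print(midi_to_freq(69))       # 440.0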

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-919-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-920-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-921-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-922-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-923-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)

-924-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-925-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-926-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-927-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-928-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-929-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from

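The leap rule described in the excerpt above is mechanical enough to state as code. A toy
sketch (the set of permitted leap intervals, in semitones, and the omission of the three-note
sonority check are my simplifications, not the author's):

    # Permitted leaps in semitones (assumed: m3, M3, P4, P5, m6, octave).
    PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}

    def leap_rule_ok(melody):
        """Check each leap in a melody (MIDI note numbers) against the rule:
        a leap must be followed by a step in the opposite direction or by
        another permitted leap."""
        for a, b, c in zip(melody, melody[1:], melody[2:]):
            first, second = b - a, c - b
            if abs(first) in PERMITTED_LEAPS:            # (a, b) is a leap
                steps_back = 1 <= abs(second) <= 2 and first * second < 0
                leaps_on = abs(second) in PERMITTED_LEAPS
                if not (steps_back or leaps_on):
                    return False
            elif abs(first) > 2:                         # leap of a forbidden size
                return False
        return True

    print(leap_rule_ok([60, 65, 64, 62, 60]))  # True: leap up a 4th, then steps down
    print(leap_rule_ok([60, 67, 69, 67, 65]))  # False: leap up a 5th, then a step up
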
Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password: a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
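
The quantification described here has a standard closed form under the MIDI convention
(twelve-tone equal temperament; the A4 = 440 Hz anchor is an assumption of this sketch, not of
the article):

    import math

    A4_HZ = 440.0   # assumed reference: A4 is MIDI note 69

    def frequency_to_pitch(f_hz: float) -> int:
        """Map a frequency in hertz to the nearest equal-tempered MIDI note."""
        return round(69 + 12 * math.log2(f_hz / A4_HZ))

    def pitch_to_frequency(midi_note: int) -> float:
        """Inverse mapping: MIDI note number to frequency in hertz."""
        return A4_HZ * 2.0 ** ((midi_note - 69) / 12)

    print(frequency_to_pitch(261.63))  # 60, i.e. middle C
    print(pitch_to_frequency(69))      # 440.0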
...

Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale

-938-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-939-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship

-940-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and

-941-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-942-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).

-943-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
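
The leap rule described at the opening of this passage is concrete enough to state in code. A toy sketch (my own, with a simplified interval threshold; the permissible-sonority check for successive leaps is omitted):

```python
def leaps_ok(melody, max_step=2):
    """melody: pitches in semitones. True if every leap (> max_step) is followed
    by a step in the opposite direction or by another leap."""
    intervals = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(intervals, intervals[1:]):
        if abs(prev) > max_step:                           # `prev` is a leap
            contrary_step = abs(nxt) <= max_step and prev * nxt < 0
            another_leap = abs(nxt) > max_step
            if not (contrary_step or another_leap):
                return False
    return True

print(leaps_ok([60, 65, 64]))  # leap up, then step down: True
print(leaps_ok([60, 65, 69]))  # leap followed by another leap: True
print(leaps_ok([60, 65, 66]))  # leap, then step in the same direction: False
```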

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
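
As a small aside on "quantified as frequencies in cycles per second": under the standard twelve-tone equal-temperament convention with A4 = 440 Hz (a convention, not something the quoted text fixes), the note-number-to-hertz mapping is:

```python
def midi_to_hz(n, a4=440.0):
    """Frequency in Hz of MIDI note n under 12-tone equal temperament (A4 = note 69)."""
    return a4 * 2.0 ** ((n - 69) / 12)

print(midi_to_hz(69))  # 440.0   (A4)
print(midi_to_hz(60))  # ~261.63 (middle C)
print(midi_to_hz(81))  # 880.0   (A5: one octave up doubles the frequency)
```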

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)
(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information

Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
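The usual quantification mentioned here is twelve-tone equal temperament referenced to A4 = 440 Hz. A minimal Python sketch (the use of MIDI note numbers is the example's assumption, not the article's):

def midi_to_hz(note: int, a4: float = 440.0) -> float:
    """Frequency of a MIDI note number in 12-tone equal temperament (A4 = 69)."""
    return a4 * 2.0 ** ((note - 69) / 12)

print(midi_to_hz(69))   # 440.0 Hz (A4)
print(midi_to_hz(60))   # ~261.6 Hz (middle C)
print(midi_to_hz(81))   # 880.0 Hz: twelve semitones up doubles the frequency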
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)


02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools
psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <131 402>


In essence, the algorithm looks at a string of intervals derived from the successive values in some musical dimension in a piece of music. The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one string can be selected for analysis. Then the algorithm combines the values of each dimension's successive intervals according to a user-specified average which assigns a relative "weight" to each of the dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four successive dimension values labeled A through D forming three successive unordered intervals labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string of values is segmented in half; the value C starts a new segment or phrase. In its simplest form, using only one musical dimension, the algorithm works by going through the dimension's list of undirected intervals in threes looking for maximum values and segmenting accordingly. This results in a series of successive segments (or phrases). We can then average the values in each of the output segments to get a series of new higher-order values. We input these into the algorithm to produce a second-order segmentation, and so forth, until the music is parsed into a single segment.

To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a shows the results using one dimension, pitch alone. The first pass segments the pitches into segments of three to six pitches; that is, the segmentation is determined by the sequence of the sizes of successive unordered pitch intervals. The segmental boundaries are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a lower register from the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation. Let us examine Example 21b with the algorithm taking only time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
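A minimal Python sketch of the single-dimension case just described, under one reading of the maximum test ("greater than its two neighboring intervals"); the function names and sample pitches are invented, and the full Tenney/Polansky formulation also supports weighted multi-dimensional input:

def segment(values):
    """One pass: open a new segment where an interval exceeds both its neighbors."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    segments, start = [], 0
    # Scan interval triples (X, Y, Z): a locally maximal middle interval Y means
    # the string is cut so that the third value ("C") starts a new segment.
    for i in range(1, len(intervals) - 1):
        if intervals[i] > intervals[i - 1] and intervals[i] > intervals[i + 1]:
            segments.append(values[start:i + 1])
            start = i + 1
    segments.append(values[start:])
    return segments

def parse(values):
    """Re-run the pass on segment averages until the music is one segment."""
    levels = [segment(values)]
    while len(levels[-1]) > 1:
        averages = [sum(seg) / len(seg) for seg in levels[-1]]
        nxt = segment(averages)
        if len(nxt) >= len(levels[-1]):    # no further reduction; stop
            break
        levels.append(nxt)
    return levels

for level in parse([64, 66, 67, 65, 60, 62, 59, 72, 74, 71]):   # invented pitches
    print(level)

On the invented input, the first pass cuts before the two largest local interval maxima (the leap down to 60 and the leap up to 72), and the second pass collapses the three segment averages into a single segment.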

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and

-968-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also[edit]
Forte number
References[edit]
Jump up ^ "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
Jump up ^ http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
Jump up ^ Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter,"
Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 8.
Jump up ^ David Carson Berry, "Journal of Music Theory under Allen Fortes Editorship," Journal
of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David
Carson Berry, Gamut 6/2 (2013), 3.
Jump up ^ Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2
(1964): 136-183.
Jump up ^ David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove
Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford
University Press, 2013), 8:175-176.
External links[edit]

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance
ry Primer
1
Pitch and pitch-class (pc)(1)
Pitch space
: a linear series of pitches (semitones) from low to high modeled by integers.(2) Sets of
pitches (called
psets
) are selections from the set of pitches; they are unordered in time.(3)
Pc space
: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see
below).(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any
number of octaves map to the same pitch-class.(5) Sets of pcs (called
pcsets
) are selections from the set of pcs; they are unordered in time (and pitch).(6) Pcsets must be
realized (or represented or articulated) by pitches. To realize a pcset in music, it must
be ordered in pitch andin time. Every musical articulation of a pcset produces a contour. Many
different psets may represent one pcset. Pcsets may modelmelodies, harmonies, mixed textures,
etc.Definitions from Finite Set Theory(6) The set of all the pcs is called the
aggregate
and is denoted by the letter U; the set of no pcs is called the empty or
null
set, and isdenoted by the sign

(7) Membership: If a is a member (or element) of the set B, we write a


?

-969-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
B.(8) Inclusion: If A and B are sets and A is contained in B, we write A
?
B.(9) The union of two sets A and B (written A
?
B) is the content of both of them.(10) The intersection of two sets A and B is their common
elements (written A
n
B).(11) Two sets are disjoint if their intersection is

.(12) B is the complement of A, if B contains all elements of U not in A. We show the


complement of A by A
'
.NB: A
n
A
'
=

(A and A
'
are d

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for

-970-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL divergence between the joint density and the product of the
individual densities. So MI can measure non-monotonic relationships and other, more
complicated relationships.
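
A small numerical sketch makes the distinction concrete: for Y = X^2 with X symmetric about
zero, Pearson's correlation is near zero while the mutual information is clearly positive.
The histogram-based MI estimate below is a crude plug-in estimator, used only to illustrate
the point.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2                       # deterministic but non-monotonic dependence

print(np.corrcoef(x, y)[0, 1])   # ~ 0: no linear relationship

def mutual_information(x, y, bins=30):
    """Crude plug-in MI estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty bins
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(mutual_information(x, y))  # clearly > 0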

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the
time series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
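
The "ordinal patterns" behind the MIOP measure can be sketched as follows: each length-m
window of a series is replaced by the permutation that sorts it, and the similarity of two
series is then computed on those symbol streams (for example, with a discrete MI estimator
like the one above). The embedding length m = 3 is an illustrative choice, not the paper's
exact setting.

from itertools import permutations
import numpy as np

def ordinal_patterns(series, m=3):
    """Map each length-m window to the index of its sorting permutation."""
    codes = {p: i for i, p in enumerate(permutations(range(m)))}
    out = []
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        out.append(codes[tuple(np.argsort(window))])  # rank pattern of the window
    return np.array(out)

x = np.sin(np.linspace(0, 20, 500))
print(ordinal_patterns(x)[:10])  # symbol stream; MI between two such streams ~ MIOP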

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
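
That practical asymmetry is visible in code: covariance is a one-line moment computation on
the sample, whereas MI first needs a density estimate (here the same crude histogram plug-in
sketched above), and the MI estimate shifts with the arbitrary binning choice.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = 0.5 * x + rng.normal(size=50_000)

# Covariance: directly a moment of the sample, no densities needed.
cov = (x * y).mean() - x.mean() * y.mean()
print(cov)  # ~ 0.5

# MI: requires estimating the joint distribution; the bin counts below are
# arbitrary choices, and the estimate moves as they change.
for bins in (10, 30, 100):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    print(bins, (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())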

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music
that appears to have little to do with tonality or even pitch concentricity.1 Joseph N.
Straus and others have, however, called such work into question.2 Other theorists have
obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of (un-interpreted) pc
segments, often called lynes, are realized in pitch, time, and other musical dimensions,
using some means of musical articulation to maintain an association between the components
of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby separating it
out from ...
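
The leap rule quoted at the start of this passage is concrete enough to mechanize. The
sketch below encodes one simplified reading of it: a leap is any interval of three or more
semitones, and each leap must be answered by a step in the opposite direction or by another
leap. The semitone threshold and the test melodies are illustrative only; the treatises
state the rule in diatonic terms, and the three-note-sonority proviso is omitted here.

def leap_rule_violations(melody, leap=3):
    """Indices where a leap is not followed by a contrary step or another leap.

    melody: pitches as MIDI note numbers; an interval of `leap` semitones
    or more counts as a leap (a chromatic simplification of the modal rule).
    """
    violations = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) >= leap:  # a leap...
            contrary_step = abs(second) <= 2 and first * second < 0
            another_leap = abs(second) >= leap  # sonority check omitted
            if not (contrary_step or another_leap):
                violations.append(i)
    return violations

print(leap_rule_violations([60, 65, 64, 62, 60]))  # leap up, step down: []
print(leap_rule_violations([60, 65, 67]))          # leap up, step up: [0]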

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...

-977-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-978-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-979-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-980-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-981-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-982-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As

-983-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch

-984-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance


In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
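
The counterpoint rule quoted above (a leap answered by a step in the opposite direction) is
concrete enough to check mechanically. A toy sketch, with pitches as MIDI numbers; the "another
leap" branch and the permissible-sonority list are omitted here.

def leap_violations(melody, max_step=2):
    # Flag leaps (intervals beyond a step, in semitones) that are not
    # followed by a step in the opposite direction.
    flagged = []
    for i in range(len(melody) - 2):
        a, b, c = melody[i:i + 3]
        leap, answer = b - a, c - b
        if abs(leap) > max_step:
            contrary_step = abs(answer) <= max_step and leap * answer < 0
            if not contrary_step:
                flagged.append(i + 1)   # index of the note reached by leap
    return flagged

# The leap 62 -> 67 is followed by further ascending motion, so it is flagged.
print(leap_violations([60, 65, 64, 62, 67, 69]))   # -> [4]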

...arlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
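
For what it is worth, the machinery behind a token like "interval vector" is small: count the
interval classes between all unordered pairs of pitch classes. A sketch; representing 6-Z44 as
{0, 1, 2, 5, 6, 9} is this sketch's assumption.

from itertools import combinations

def interval_vector(pcs):
    # Tally interval classes 1-6 over all unordered pairs of pitch classes.
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        vec[min(d, 12 - d) - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))   # 6-Z44 -> [3, 1, 3, 4, 3, 1]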

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
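
A crude sketch of assigning a frequency to a periodic waveform, using autocorrelation
peak-picking rather than the perceptual pure-tone matching described above; all names here are
illustrative.

import numpy as np

def estimate_frequency(signal, sample_rate):
    # Estimate the dominant frequency of a roughly periodic signal by
    # locating the strongest autocorrelation peak after the initial decline.
    signal = signal - np.mean(signal)
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    rising = np.argmax(np.diff(ac) > 0)   # first lag where ac turns upward
    period = rising + np.argmax(ac[rising:])
    return sample_rate / period

sr = 44100
t = np.arange(0, 0.1, 1 / sr)
print(estimate_frequency(np.sin(2 * np.pi * 440 * t), sr))   # ~440-441 Hz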
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-991-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-992-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-993-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-994-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-995-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the

-996-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,

-997-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-998-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch is closely related to frequency, but the two are not equivalent. Frequency is an objective, scientific attribute that can be measured. Pitch is each person's subjective perception of a sound wave, which cannot be directly measured. However, this does not necessarily mean that most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and the specific physiology of the auditory system work together to yield the experience of pitch. In general, pitch perception theories can be divided into place coding and temporal coding. Place theory holds that the perception of pitch is determined by the place of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for the perception of high frequencies, since neurons have an upper limit on how fast they can phase-lock their action potentials.[6] However, a purely place-based theory cannot account for the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
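
The "measured to obtain a frequency" step can be sketched in a few lines (an illustrative autocorrelation method under simplifying assumptions, not the comparison-with-pure-tones procedure described above): a complex tone built on a 220 Hz fundamental gets assigned a frequency near 220 Hz.

import numpy as np

fs = 44_100  # sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)
# A 220 Hz complex tone: fundamental plus two harmonics.
wave = (np.sin(2 * np.pi * 220 * t)
        + 0.5 * np.sin(2 * np.pi * 440 * t)
        + 0.3 * np.sin(2 * np.pi * 660 * t))

ac = np.correlate(wave, wave, mode="full")[wave.size - 1:]
# Skip the zero-lag lobe, then take the biggest later peak: that lag is
# approximately one fundamental period.
trough = np.argmax(ac[1:] < 0) + 1
period = trough + np.argmax(ac[trough:])
print(f"estimated frequency: {fs / period:.1f} Hz")  # ~220 Hz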

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
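
The leap rule quoted above is mechanical enough to state in code. A minimal sketch (the three-semitone leap threshold is a simplifying assumption, and the check of "permissible three-note sonorities" is omitted):

# Sketch of the quoted melodic rule: a leap must be followed by a step in
# the opposite direction or by another leap. Pitches are MIDI numbers.
def leap_rule_violations(melody):
    # Return indices where a leap is not answered by a contrary step
    # or by another leap.
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:  # a leap (simplified: more than a whole step)
            contrary_step = abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2  # (sonority check omitted)
            if not (contrary_step or another_leap):
                bad.append(i + 1)
    return bad

# C4 up a sixth to A4, then continuing upward by step: violates the rule.
print(leap_rule_violations([60, 69, 71]))  # -> [1]
# C4 up to A4, then stepping back down to G4: OK.
print(leap_rule_violations([60, 69, 67]))  # -> []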

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
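
Both of the "forbidding" expressions just mentioned are in fact small computations. A minimal sketch of the interval vector (the pc set below is 6-Z44, with its label and prime form taken from Forte's table):

# Sketch: the interval vector counts interval classes (1..6) between all
# pairs of pitch classes in a set.
from itertools import combinations

def interval_vector(pcs):
    # Interval-class vector of a pitch-class set (entries for ic 1..6).
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)  # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector({0, 1, 2, 5, 6, 9}))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]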

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1004-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1005-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1006-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-1007-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-1008-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-1009-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,

-1010-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-1011-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-1012-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-1013-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
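
As a concrete version of that frequency-to-pitch quantification (my own sketch, assuming
12-tone equal temperament with A4 = 440 Hz, which the excerpt itself does not specify):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz: float) -> str:
    # MIDI note number: 69 is A4; each semitone is a factor of 2**(1/12).
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1          # MIDI 60 -> C4
    return f"{name}{octave}"

print(frequency_to_pitch(440.0))     # A4
print(frequency_to_pitch(261.63))    # C4 (middle C)
print(frequency_to_pitch(466.16))    # A#4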
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod
12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
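
These definitions translate almost directly into code; a minimal sketch (my own illustration,
using Python's built-in sets, with the aggregate U = {0, ..., 11}):

# Pitch space: pitches as integers; pc space: integers mod 12.
U = frozenset(range(12))                 # the aggregate

def pc(pitch: int) -> int:
    # Octave-related pitches map to the same pitch-class.
    return pitch % 12

# A pset (pitches) and the pcset it represents:
pset = {60, 64, 67, 72}                  # e.g. C4, E4, G4, C5
pcset = {pc(p) for p in pset}            # {0, 4, 7}; C4 and C5 collapse to pc 0

A = {0, 4, 7}
B = {7, 11, 2}

print(A | B)                             # union, A ∪ B
print(A & B)                             # intersection, A ∩ B
print(A.isdisjoint(B))                   # disjoint iff A ∩ B = ∅
print(U - A)                             # complement A′ relative to the aggregate
print((A & (U - A)) == set())            # A ∩ A′ = ∅, always True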

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike

-1024-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-1025-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-1026-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-1027-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
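
The leap rule quoted above is concrete enough to state as a toy checker (my own formalization:
the set of permitted leap sizes below is a simplification, and the further condition that two
successive leaps outline a permissible three-note sonority is omitted). Melodies are given as
MIDI note numbers:

ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # 3rds, 4th, 5th, minor 6th, octave

def leap_rule_violations(melody):
    # a leap must be followed by a step in the opposite direction
    # or by another (permitted) leap
    violations = []
    for i in range(len(melody) - 2):
        a, b, c = melody[i], melody[i + 1], melody[i + 2]
        first, second = b - a, c - b
        if abs(first) in ALLOWED_LEAPS:
            steps_back = abs(second) <= 2 and first * second < 0
            leaps_on = abs(second) in ALLOWED_LEAPS
            if not (steps_back or leaps_on):
                violations.append(i)
    return violations

print(leap_rule_violations([62, 64, 69, 67, 65, 64, 62]))  # []  (leap, then step down)
print(leap_rule_violations([62, 64, 69, 71, 69, 67, 62]))  # [1] (leap, then step up)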

Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
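
The "interval vector" mentioned here is mechanical to compute: for a pitch-class set, tally the
interval class (1 through 6) of every unordered pair of pitch classes. A short sketch, applied
to the prime form of the set class 6-Z44 named above:

from itertools import combinations

def interval_vector(pcs):
    # count interval classes 1..6 over all unordered pairs of pitch classes
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        vec[min(d, 12 - d) - 1] += 1  # fold d and 12 - d into one class
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))  # 6-Z44 -> [3, 1, 3, 4, 3, 1]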

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
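
The quantification described above (assigning a pitch by comparing a sound with pure tones of
known frequency) can be sketched as snapping a frequency in hertz to the nearest note of
12-tone equal temperament. The A4 = 440 Hz reference and the MIDI numbering are conventional
assumptions:

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_pitch(freq_hz, a4=440.0):
    # distance from A4 in equal-tempered semitones, rounded to a note
    semitones = 12 * math.log2(freq_hz / a4)
    midi = round(69 + semitones)             # MIDI convention: A4 = 69
    cents = 100 * ((69 + semitones) - midi)  # leftover deviation in cents
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1), cents

print(nearest_pitch(261.63))  # ('C4', ~0 cents)
print(nearest_pitch(450.0))   # ('A4', about +39 cents)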

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional
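
Since "6-Z44" and "interval vector" are held up above as the forbidding vocabulary, a short
sketch of what the second term denotes may help; the prime form [0,1,2,5,6,9] used here is the
standard set-class table entry for Forte's 6-Z44.

from itertools import combinations

def interval_class_vector(pcs):
    # count interval classes 1..6 over all unordered pairs of the pc set
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        d = abs(a - b) % 12
        vec[min(d, 12 - d) - 1] += 1   # fold intervals > 6 onto their inversions
    return vec

# prime form of Forte's set class 6-Z44 (the "magic password" example above)
print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]

The "Z" in the label flags that a different set class shares this same vector, which is exactly
the kind of relationship the notation is built to expose.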

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
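
The remark above that pitches are "quantified as frequencies ... by comparing sounds with pure
tones" has a standard numerical counterpart: snapping a measured frequency to the
equal-tempered scale. A minimal sketch, assuming the A4 = 440 Hz reference and MIDI note
numbering (both conventions are mine, not the quoted text's):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(f_hz, a4=440.0):
    # 69 is the MIDI number of A4; each semitone is a factor of 2**(1/12)
    midi = 69 + 12 * math.log2(f_hz / a4)
    nearest = round(midi)
    cents = 100 * (midi - nearest)       # deviation from the nearest pitch
    return NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1), cents

print(frequency_to_pitch(440.0))    # ('A4', 0.0)
print(frequency_to_pitch(261.63))   # ('C4', ~0): middle C
print(frequency_to_pitch(445.0))    # ('A4', ~ +19.6 cents sharp)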

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-1045-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-1046-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-1047-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1048-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1049-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be

-1050-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.

-1051-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-1052-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-1053-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-1054-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-1055-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-1056-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
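
The cantus firmus leap rule quoted above is mechanical enough to sketch as code. A minimal sketch, assuming pitches are given as MIDI note numbers, that a "step" is one or two semitones, and that leaps beyond an octave are disallowed; since the passage does not enumerate the permissible three-note sonorities, consecutive leaps are deferred rather than judged here:

def check_leaps(melody, max_leap=12):
    # Flag leaps not followed by a step in the opposite direction
    # (consecutive leaps are deferred to a separate sonority check).
    problems = []
    for i in range(len(melody) - 2):
        leap = melody[i + 1] - melody[i]
        if abs(leap) <= 2:                     # a step: nothing to resolve
            continue
        if abs(leap) > max_leap:
            problems.append((i, "leap exceeds the permitted interval"))
        nxt = melody[i + 2] - melody[i + 1]
        if abs(nxt) <= 2 and nxt * leap < 0:   # step, opposite direction: OK
            continue
        if abs(nxt) > 2:                       # another leap: sonority rule applies
            continue
        problems.append((i + 1, "leap not resolved by a step in the opposite direction"))
    return problems

# D4-F4-E4-A4-G4 as MIDI numbers: both leaps resolve down by step
print(check_leaps([62, 65, 64, 69, 67]))       # []
print(check_leaps([60, 67, 69]))               # leap C4-G4 followed by a step up: flagged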

Personal life[edit]
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)[edit]


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
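
The quantification described here, matching a sound to the nearest pure tone and hence to a note, is a short computation. A sketch assuming twelve-tone equal temperament with A4 = 440 Hz and MIDI-style note naming:

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz, a4=440.0):
    # Nearest equal-tempered pitch plus the deviation from it in cents.
    midi_exact = 69 + 12 * math.log2(freq_hz / a4)
    midi = round(midi_exact)
    cents = 100 * (midi_exact - midi)
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1), cents

print(frequency_to_pitch(440.0))    # ('A4', 0.0)
print(frequency_to_pitch(261.63))   # close to 'C4'
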
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-1059-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-1060-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by

-1061-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-1062-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the

-1063-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor

-1064-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action

-1065-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-1066-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-1067-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1068-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1069-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1070-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
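As a toy formalization of the leap rule quoted above (with the three-note-sonority proviso
omitted), here is a sketch; treating a "step" as at most two semitones is an assumption of the
sketch, not part of the quoted rule:

STEP_MAX = 2   # a step spans one or two semitones (sketch assumption)

def leaps_resolved(melody):
    # A leap must be followed either by a step in the opposite
    # direction or by another leap.
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) > STEP_MAX:
            step_back = abs(second) <= STEP_MAX and first * second < 0
            if not (step_back or abs(second) > STEP_MAX):
                return False
    return True

print(leaps_resolved([60, 67, 65, 64]))   # leap up, step back down: True
print(leaps_resolved([60, 67, 69]))       # leap up, step onward: False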

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
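The quantification mentioned above has a standard closed form in twelve-tone equal temperament;
the A4 = 440 Hz reference and MIDI numbering below are conventional assumptions, not part of
the quoted text:

import math

A4_HZ, A4_NUM = 440.0, 69      # MIDI convention: A4 = note number 69

def freq_to_pitch(f_hz):
    # nearest equal-tempered note number for a measured frequency
    return round(A4_NUM + 12 * math.log2(f_hz / A4_HZ))

def pitch_to_freq(p):
    return A4_HZ * 2 ** ((p - A4_NUM) / 12)

print(freq_to_pitch(261.63))          # 60 (middle C)
print(round(pitch_to_freq(60), 2))    # 261.63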

Forte is well known for his book The Structure of Atonal Music (1973).
Personal life

Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
1. "In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
   Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
   Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
   Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
   Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed.
   David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964):
   136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary
   of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University
   Press, 2013), 8:175-176.

Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement
of A by A′.
NB: A ∩ A′ = ∅ (A and A′ are disjoint).
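Definitions (6)-(12) map directly onto a general-purpose set type; a small sketch in Python
(the particular pcsets are arbitrary examples):

def pc(pitch):
    # (4): pitches any number of octaves apart map to the same pitch class
    return pitch % 12

U = frozenset(range(12))                        # (6) the aggregate
EMPTY = frozenset()                             # (6) the null set
A = frozenset({0, 4, 7})                        # an arbitrary pcset
B = frozenset(pc(p) for p in (60, 64, 67, 70))  # a pset mapped to pcs

print(0 in A)               # (7) membership
print(A <= B)               # (8) inclusion
print(A | B)                # (9) union
print(A & B)                # (10) intersection
print(A.isdisjoint(EMPTY))  # (11) disjointness
A_comp = U - A              # (12) complement A' relative to U
print(A & A_comp == EMPTY)  # NB: A and A' are disjoint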

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. The algorithm then combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Performed on the Schoenberg piece, Example 21a shows the results using one
dimension, pitch alone: the first pass segments the pitches into segments of three to six
pitches; that is, the segmentation is determined by the sequence of the sizes of successive
unordered pitch intervals.
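A compact sketch of the algorithm's single-dimension form as just described, in pure Python;
the toy input and the strict-inequality test for the middle interval are illustrative choices:

def segment_once(values):
    # scan undirected intervals in threes; where the middle interval
    # exceeds both neighbors, the following value opens a new segment
    ivals = [abs(b - a) for a, b in zip(values, values[1:])]
    bounds = [0]
    for i in range(1, len(ivals) - 1):
        if ivals[i] > ivals[i - 1] and ivals[i] > ivals[i + 1]:
            bounds.append(i + 1)
    bounds.append(len(values))
    return [values[a:b] for a, b in zip(bounds, bounds[1:])]

def parse(values):
    # average each output segment, re-segment the averages, and repeat
    # until the music is parsed into a single segment
    levels = [segment_once(list(values))]
    while len(levels[-1]) > 1:
        means = [sum(seg) / len(seg) for seg in levels[-1]]
        levels.append(segment_once(means))
    return levels

for level in parse([60, 62, 61, 67, 66, 68, 59, 58, 60]):
    print(level)
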
read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local

(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows
Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html


cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles (contour labels: <0 2 1>, <1 0 3 2 ...>, <131 402>)
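Labels such as <0 2 1> are contour segments: each value in a segment is replaced by its rank,
with 0 for the lowest. A minimal sketch:

def cseg(values):
    # contour segment: rank of each value, 0 = lowest
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r
    return tuple(ranks)

print(cseg([60, 67, 64]))       # (0, 2, 1)
print(cseg([63, 60, 67, 64]))   # (1, 0, 3, 2)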


In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of undirected intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher-order values. We input these into
the algorithm to produce a second-order segmentation, and so forth, until the music is parsed
into a single segment. To illustrate the Tenney/Polansky algorithm, we perform it on the
Schoenberg piece. Example 21a shows the results using one dimension, pitch alone. The first
pass segments the pitches into segments of three to six pitches; that is, the segmentation is
determined by the sequence of the sizes of successive unordered pitch intervals. The segmental
boundaries
are shown by vertical lines. The results are quite reasonable. For instance, the four pitches
<E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since they fall in a
lower register than the others. Phrases 4 and 5 seem segmented correctly; the first is
divided into two segments, the second into one. And in general, the boundaries of these
first-level segments never contradict our more intuitively derived phrase structure. The
second pass works on the averages of the values in each level-1 segment. These averages are
simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The
intervals between the series of bandwidths forming level 2 are the input to the second pass of
the algorithm. The resulting second-level segmentation divides the piece in half in the
middle of the third phrase, which contradicts our six-phrase structure. That the second-pass
parsing is at variance with our phrase structure is not an embarrassment, for we are taking
pitch intervals as the only criterion for segmentation. Let us examine Example 21b with the
algorithm taking only time spans between notes as input. Here the unit of time is a
thirty-second note. Once again the first level basically conforms to our ideas of the phrase
structure, with two exceptions. Likewise, the second pass partitions the stream of durations
so that it has an exception inherited from level 1; the last phrase is divided in half,
with its first part serving as a conclusion to the second-level segment that starts at phrase
4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial
values of the previous examples are simply added together. This time the results get
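
To make the first pass concrete, here is a minimal Python sketch of the one-dimensional case
just described: it scans the undirected intervals in threes, starts a new segment wherever the
middle interval exceeds its neighbors, and averages each segment to produce the next level's
input. The function names and the sample pitch list are illustrative assumptions, not the
published Tenney/Polansky implementation.

def first_pass_segments(values):
    # For values A B C D with unordered intervals X, Y, Z: if Y > X and
    # Y > Z, the value C starts a new segment (the "maximum" rule above).
    if len(values) < 4:
        return [list(values)]
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    segments, current = [], [values[0], values[1]]
    for i in range(1, len(intervals) - 1):
        x, y, z = intervals[i - 1], intervals[i], intervals[i + 1]
        if y > x and y > z:           # middle interval is a local maximum
            segments.append(current)  # close the segment before value C
            current = []
        current.append(values[i + 1])
    current.append(values[-1])
    segments.append(current)
    return segments

def next_level(segments):
    # Average each segment to get the higher-order values for the next pass.
    return [sum(seg) / len(seg) for seg in segments]

pitches = [60, 64, 63, 58, 70, 69, 67, 72]   # toy input, not the Schoenberg piece
segs = first_pass_segments(pitches)          # [[60, 64, 63, 58], [70, 69, 67, 72]]
print(segs, next_level(segs))                # second-pass input: [61.25, 69.5]

Feeding next_level's output back into first_pass_segments gives the second-order segmentation,
and so on until a single segment remains.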

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

References
1. "In memoriam Allen Forte, music theorist," news.yale.edu, October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

External links

Theory Primer

1. Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: If a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: If A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the
complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
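
As a quick illustration of these definitions, the short Python sketch below models pitches as
integers, reduces them to pitch classes mod 12, and exercises the set operations from items
(7)-(12); the sample sets and names are mine, not the primer's.

AGGREGATE = frozenset(range(12))      # U, the set of all twelve pcs

def pc(pitch):
    # Map a pitch (integer semitones, middle C = 60) to its pitch class.
    return pitch % 12

pset = [40, 52, 76]                   # the same E in three different octaves
pcset = {pc(p) for p in pset}         # octave-related pitches map to one pc: {4}

A = {0, 4, 7}                         # pcset of a C major triad
B = {4, 7, 11}                        # pcset of an E minor triad
print(A | B)                          # union: {0, 4, 7, 11}
print(A & B)                          # intersection: {4, 7}
print(A.isdisjoint({1, 2}))           # True: their intersection is the null set
complement = AGGREGATE - A            # A′ relative to the aggregate U
assert A & complement == set()        # A ∩ A′ = ∅, as in (12)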

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth.
Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus
and others have, however, called such work into question.[2] Other theorists have obviated
voice-leading as a criterion for distinguishing linear aspects of pitch structure. For
example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments,
often called lynes, are realized in pitch, time, and other musical dimensions, using some
means of musical articulation to maintain an association between the components of a given
lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic
level, a mode of articulation, or any combination of these, thereby separating it out from

...earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
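
The quoted passage notes that pitches are quantified by comparing sounds with pure tones of
known frequency. One standard concrete form of that mapping is the MIDI convention (A4 =
440 Hz = note 69, twelve equal-tempered notes per octave); the sketch below assumes that
convention, which comes from common practice rather than from the quoted text itself.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    # Map a frequency to the nearest equal-tempered note and report the
    # deviation in cents (hundredths of a semitone).
    midi = 69 + 12 * math.log2(freq_hz / 440.0)
    nearest = round(midi)
    cents = 100 * (midi - nearest)
    return NOTE_NAMES[nearest % 12], nearest // 12 - 1, cents

print(nearest_note(440.0))    # ('A', 4, 0.0)
print(nearest_note(261.63))   # ('C', 4, ~0 cents)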

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP[25]. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
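
The contrast drawn in these excerpts is easy to demonstrate numerically: for y = x^2 with x
symmetric about zero, the Pearson correlation is near zero even though y is completely
determined by x, while a mutual-information estimate stays clearly positive. The histogram
("plug-in") estimator below is a rough sketch for illustration, not a production-quality MI
estimator.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2                                  # deterministic but non-monotonic

r = np.corrcoef(x, y)[0, 1]                 # near 0: no linear relationship

def mi_histogram(x, y, bins=30):
    # Plug-in estimate of I(X;Y) in nats from a joint 2-D histogram.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of X
    py = pxy.sum(axis=0, keepdims=True)     # marginal of Y
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(f"Pearson r = {r:.3f}")                         # approximately 0
print(f"MI estimate = {mi_histogram(x, y):.3f} nats") # clearly > 0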

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-1092-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-1093-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1094-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1095-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1096-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale

-1097-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the product
of the individual marginal densities. So MI can measure non-monotonic relationships and other
more complicated relationships.
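
A minimal sketch of that difference, assuming NumPy and scikit-learn are available: on a noisy Y = X^2 relationship, which is strong but non-monotonic, Pearson's r comes out near zero while an MI estimate is clearly positive.

import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 5000)
y = x**2 + rng.normal(0.0, 0.05, 5000)   # deterministic dependence plus noise

r = np.corrcoef(x, y)[0, 1]                          # near 0: no linear trend
mi = mutual_info_regression(x.reshape(-1, 1), y)[0]  # > 0: dependence detected

print(f"Pearson r = {r:+.3f}, estimated MI = {mi:.3f} nats")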

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the
time-series ordinal patterns MIOP. The first is a linear measure and the latter two are
non-linear ones.
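
One plausible rendering of those three SSMs in code is sketched below; the histogram bin count and the ordinal-pattern order are illustrative assumptions, not the settings used in the cited paper.

import numpy as np
from itertools import permutations

def cc(x, y):
    # SSM 1: absolute value of the zero-lag cross correlation (Pearson's coefficient)
    return abs(np.corrcoef(x, y)[0, 1])

def mi_hist(x, y, bins=16):
    # SSM 2: mutual information from a 2-D histogram estimate, in nats
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

def ordinal_symbols(x, order=3):
    # Encode each length-`order` window by the rank order of its values
    lookup = {p: i for i, p in enumerate(permutations(range(order)))}
    windows = np.lib.stride_tricks.sliding_window_view(x, order)
    return np.array([lookup[tuple(np.argsort(w))] for w in windows])

def miop(x, y, order=3):
    # SSM 3: mutual information of the ordinal-pattern symbol sequences
    n_patterns = len(list(permutations(range(order))))
    return mi_hist(ordinal_symbols(x, order), ordinal_symbols(y, order), bins=n_patterns)

Each function takes two equal-length 1-D arrays; under all three measures, larger values indicate more similar dynamics.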

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task compared to the estimation of Covariance.
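
That asymmetry is easy to make concrete: sample covariance is a single expression in data moments, while a histogram-based MI estimate shifts with the bin count chosen to approximate the unknown densities. A minimal NumPy sketch (the bin counts are arbitrary choices for illustration):

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = 0.5 * x + rng.normal(scale=0.5, size=2000)

# Covariance: computed directly from sample moments, no distribution needed.
cov = np.mean(x * y) - np.mean(x) * np.mean(y)

def mi_estimate(x, y, bins):
    # MI requires estimates of the joint and marginal densities (here, histograms).
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

print("cov =", round(cov, 3))
for b in (8, 32, 128):   # the MI estimate drifts with the assumed binning
    print(f"MI with {b} bins =", round(mi_estimate(x, y, b), 3))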

In single-voice writing there are "rules" for the way a melody should progress. In the composition of
a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be
followed either by a step in the opposite direction or by another leap, provided the two successive
leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of
a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down
form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved?
What scale degrees and harmonies are involved? (And the answers to such questions will of course
depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric
into meaningful units. They help me to determine "by ear" whether the next note is in the same voice,
or jumps to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading for
non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do with
tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work
into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear
aspects of pitch structure. For example, in my own theory of compositional design, ensembles of
(un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical
dimensions, using some means of musical articulation to maintain an association between the
components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument,
a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

...

earlier days, set theory has had an air of the secret society about it, with admission granted only
to those who possess the magic password, a forbidding technical vocabulary bristling with expressions
like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile
application of arcane, mathematical concepts to inaudible and uninteresting musical relationships.
This situation has created understandable frustration among musicians, and the frustration has grown
as discussions of twentieth-century music in the professional theoretical literature have come to be
expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it
managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music
shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal music,
motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives
become independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
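
The leap rule quoted at the start of this passage is mechanical enough to check in code. The Python sketch below is a simplified illustration: the set of permitted leap sizes and the handling of the "permissible three-note sonorities" condition are assumptions made for demonstration, not a complete statement of modal counterpoint.

# Pitches are MIDI note numbers; intervals are in semitones.
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}   # m3, M3, P4, P5, m6, octave (assumed set)

def check_leaps(melody):
    """Flag leaps not followed by a contrary step or another permitted leap."""
    problems = []
    for i in range(len(melody) - 2):
        a, b, c = melody[i:i + 3]
        first, second = b - a, c - b
        if abs(first) in PERMITTED_LEAPS:      # a permitted leap occurred
            contrary_step = abs(second) <= 2 and first * second < 0
            # Simplification of the "permissible three-note sonorities" condition:
            second_leap_ok = abs(second) in PERMITTED_LEAPS
            if not (contrary_step or second_leap_ok):
                problems.append((i, (a, b, c)))
        elif abs(first) > 2:                   # a leap of a forbidden size
            problems.append((i, (a, b, c)))
    return problems

# 60 -> 67 (perfect fifth up) resolves by step down: acceptable.
# 65 -> 71 (tritone) is flagged as a forbidden leap.
print(check_leaps([60, 67, 65, 71, 72]))   # -> [(2, (65, 71, 72))]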

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1113-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-1114-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-1115-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not

-1116-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
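
The cantus firmus leap rule quoted in this passage is mechanical enough to state as code. A toy
sketch (pitches as MIDI note numbers; "step" is taken to mean at most a whole tone, and the
three-note-sonority condition on successive leaps is deliberately omitted):

    def leap_rule_violations(melody):
        """Indices where a leap is followed neither by a contrary step nor by another leap."""
        bad = []
        for i in range(len(melody) - 2):
            first = melody[i + 1] - melody[i]
            second = melody[i + 2] - melody[i + 1]
            leap = abs(first) > 2                                  # beyond a whole step
            contrary_step = 0 < abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2                         # sonority check omitted
            if leap and not (contrary_step or another_leap):
                bad.append(i + 1)
        return bad

    print(leap_rule_violations([60, 65, 64, 62, 60]))  # leap up, steps back down -> []
    print(leap_rule_violations([60, 65, 67, 65]))      # leap up, then step up -> [1]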

Since its earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
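
Since the passage singles out "interval vector" (alongside set names like "6-Z44") as the
forbidding vocabulary, a short sketch of what the term denotes may help. The example set is the
all-interval tetrachord, chosen purely for illustration:

    from itertools import combinations

    def interval_vector(pc_set):
        """Interval-class vector of a pitch-class set: counts of interval classes 1..6."""
        vec = [0] * 6
        for a, b in combinations(pc_set, 2):
            ic = min((a - b) % 12, (b - a) % 12)  # interval class of the pair
            vec[ic - 1] += 1
        return vec

    print(interval_vector([0, 1, 4, 6]))  # -> [1, 1, 1, 1, 1, 1]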

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
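
The claim above that a waveform's oscillations "can be measured to obtain a frequency" is easy
to demonstrate. A crude autocorrelation-based estimator (a sketch, not a serious pitch tracker;
the 50-2000 Hz search band is an assumption):

    import numpy as np

    def estimate_f0(signal, sr, fmin=50.0, fmax=2000.0):
        # pick the autocorrelation peak inside the plausible period range
        sig = signal - signal.mean()
        ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
        lo, hi = int(sr / fmax), int(sr / fmin)
        return sr / (lo + np.argmax(ac[lo:hi]))

    sr = 44_100
    t = np.arange(4096) / sr
    print(estimate_f0(np.sin(2 * np.pi * 440.0 * t), sr))  # ~440 Hz (concert A)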

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-1118-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-1119-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of

-1120-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-1121-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are

-1122-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and

-1123-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-1124-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https

-1125-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
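
A small Python sketch of the quantification described in the excerpt above: mapping a measured
frequency to the nearest pitch of the twelve-tone equal-tempered scale (A4 = 440 Hz and the
note-name scheme are assumptions made here, not part of the article):

import math

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_pitch(f, a4=440.0):
    # nearest equal-tempered note, plus the deviation from it in cents
    midi = round(69 + 12 * math.log2(f / a4))        # 69 = MIDI number of A4
    cents = 1200 * math.log2(f / (a4 * 2 ** ((midi - 69) / 12)))
    return f"{NAMES[midi % 12]}{midi // 12 - 1}", round(cents)

print(hz_to_pitch(261.63))   # ('C4', 0)   -- middle C
print(hz_to_pitch(452.0))    # ('A4', 47)  -- 47 cents sharp of A4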

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
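
The leap rule quoted at the start of that paragraph is mechanical enough to check by machine.
A toy Python sketch (semitone sizes stand in for diatonic steps, and the permissible
three-note-sonority test is omitted, so this is a simplification, not the counterpoint rule in
full):

def leap_rule_violations(pitches):
    # return positions where a leap is followed by neither a step in the
    # opposite direction nor another leap; `pitches` are MIDI note numbers
    ivs = [b - a for a, b in zip(pitches, pitches[1:])]
    bad = []
    for i in range(len(ivs) - 1):
        cur, nxt = ivs[i], ivs[i + 1]
        if abs(cur) > 2:                          # a leap...
            counter_step = 0 < abs(nxt) <= 2 and cur * nxt < 0
            another_leap = abs(nxt) > 2
            if not (counter_step or another_leap):
                bad.append(i + 1)
    return bad

print(leap_rule_violations([60, 65, 64, 62, 60]))   # []  -- leap up, steps back down
print(leap_rule_violations([60, 65, 67]))           # [1] -- leap up, then a step up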

earlier days, set theory has had an air of the secret society about it, with admission granted
only to those who possess the magic password, a forbidding technical vocabulary bristling with
expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated
as the sterile application of arcane, mathematical concepts to inaudible and uninteresting
musical relationships. This situation has created understandable frustration among musicians,
and the frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar language.
Where did this theory come from and how has it managed to become so dominant? Set theory
emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses
only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
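
The "interval vector" just mentioned is simple to compute: for a pitch-class set, count the
pairs of pitch classes falling in each of the six interval classes. A short Python sketch (the
prime form of 6-Z44 used in the example is taken from Forte's set table):

from itertools import combinations

def interval_vector(pcs):
    # entry i-1 counts the pairs of pitch classes at interval class i
    vec = [0] * 6
    for a, b in combinations(set(p % 12 for p in pcs), 2):
        d = abs(a - b) % 12
        vec[min(d, 12 - d) - 1] += 1
    return vec

# 6-Z44, prime form (0,1,2,5,6,9): vector <313431>, which it shares with
# its Z-partner 6-Z19 -- exactly the relation the 'Z' in the name flags.
print(interval_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]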

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch

-1130-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-1131-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-1132-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability

-1133-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-1134-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after

-1135-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand

-1136-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar

-1137-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-1138-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-1139-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-1140-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

Bob's Atonal Theory Primer

Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations) modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
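
The definitions in (1) through (12) map almost one-to-one onto ordinary finite-set operations.
A minimal sketch in Python, with names of my own choosing:

AGGREGATE = frozenset(range(12))   # U: the set of all twelve pcs

def pc(pitch: int) -> int:
    # Map a pitch (integer semitone number) to its pitch class, mod 12.
    return pitch % 12

def pcset(pitches) -> frozenset:
    # The unordered pcset realized by a pset; octave information is discarded.
    return frozenset(pc(p) for p in pitches)

A = pcset([60, 64, 67])   # one registral layout of the C major triad: {0, 4, 7}
B = pcset([76, 79, 72])   # a different pset realizing the same pcset

print(A == B)                       # True: many psets represent one pcset
print(A | B, A & B)                 # union and intersection
print(AGGREGATE - A)                # the complement A' within U
print(not (A & (AGGREGATE - A)))    # True: A and A' are disjoint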

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
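
The leap-and-recovery rule in the first paragraph is concrete enough to state operationally.
Below is one hedged reading of it, not a counterpoint treatise's: the two-semitone step
threshold and the whitelist of permitted leaps are my assumptions, and the check that two
successive leaps outline a permissible three-note sonority is deliberately omitted.

# Returned indices mark notes that enter by a rule-breaking interval.
PERMITTED_LEAPS = {3, 4, 5, 7, 8, 12}   # assumed whitelist, in semitones

def leap_rule_violations(melody):
    # melody: a list of pitches as integer semitone numbers (e.g. MIDI).
    ivs = [b - a for a, b in zip(melody, melody[1:])]
    bad = []
    for i in range(len(ivs) - 1):
        cur, nxt = ivs[i], ivs[i + 1]
        if abs(cur) > 2:                                   # cur is a leap
            contrary_step = abs(nxt) <= 2 and cur * nxt < 0
            another_leap = abs(nxt) in PERMITTED_LEAPS
            if not (contrary_step or another_leap):
                bad.append(i + 2)      # index of the offending note
    return bad

print(leap_rule_violations([60, 65, 64, 62, 69, 71]))
# [5]: after the leap 62 -> 69, the note 71 enters by a step in the same direction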

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
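
The quantification described above, comparing a sound against pure tones, reduces in the
equal-tempered case to inverting f = 440 * 2^((m - 69) / 12). A minimal sketch assuming the
conventional A4 = 440 Hz reference and MIDI note numbering (neither is mandated by the quoted
text):

import math

def frequency_to_pitch(f_hz, a4_hz=440.0):
    # Return the nearest equal-tempered pitch as a MIDI note number,
    # plus the deviation from it in cents.
    midi = 69 + 12 * math.log2(f_hz / a4_hz)
    nearest = round(midi)
    return nearest, 100 * (midi - nearest)

print(frequency_to_pitch(261.63))   # (60, ~0.0 cents): middle C
print(frequency_to_pitch(446.0))    # (69, ~+23.4 cents): a sharp A4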

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1156-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their
own sake; they enable the listener to parse the ongoing musical fabric into meaningful units.
They help me to determine "by ear" whether the next note is in the same voice, or jumps to
another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts
have sought some extension or generalization of tonal voice-leading for non-tonal music.
Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear
concepts such as Schenkerian prolongation to music that appears to have little to do with
tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of compositional
design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch,
time, and other musical dimensions, using some means of musical articulation to maintain an
association between the components of a given lyne.3 For instance, a lyne might be associated
with a register, an instrument, a dynamic level, a mode of articulation, or any combination of
these, thereby separating it out from
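
The leap rule quoted above is concrete enough to check mechanically. A toy sketch (my
illustration: it calls any move larger than a whole step a leap and ignores the
three-note-sonority proviso):

def leaps_resolved(melody, step=2):
    # melody: MIDI note numbers; |interval| > step semitones is a leap.
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) > step:
            steps_back = abs(second) <= step and first * second < 0
            another_leap = abs(second) > step   # proviso not checked
            if not (steps_back or another_leap):
                return False
    return True

print(leaps_resolved([60, 65, 64, 62, 60]))  # True: leap up, steps back down
print(leaps_resolved([60, 65, 67]))          # False: leap, then a step up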

Since its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the
uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration among
musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so dominant? Set
theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music
uses only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
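
Terms like "interval vector", forbidding as they sound, are short computations. A minimal sketch
(my illustration): the interval-class vector counts, for each interval class 1 through 6, how
many pairs of pitch classes in a set lie that far apart.

from itertools import combinations

def interval_vector(pcs):
    # Interval-class vector of a pitch-class set (integers mod 12).
    vec = [0] * 6
    for a, b in combinations(set(pcs), 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 4, 6]))   # [1, 1, 1, 1, 1, 1]: the
                                       # all-interval tetrachord 4-Z15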

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
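
The earlier remark that pitches are "quantified as frequencies ... by comparing sounds with pure
tones" has a simple computational analogue. A sketch (mine, not the article's): snap a frequency
to the nearest equal-tempered pitch, where MIDI number 69 is A4 at the conventional 440 Hz.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(f_hz, a4=440.0):
    # Nearest equal-tempered pitch, plus the deviation in cents.
    midi = round(69 + 12 * math.log2(f_hz / a4))
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
    cents = 1200 * math.log2(f_hz / (a4 * 2 ** ((midi - 69) / 12)))
    return name, midi, round(cents)

print(freq_to_pitch(261.63))   # ('C4', 60, 0)
print(freq_to_pitch(450.0))    # ('A4', 69, 39): 39 cents sharp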

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition
of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and
must be followed either by a step in the opposite direction or by another leap, provided the two
successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts,
the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the dissonance correctly
prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such
questions will of course depend on whether the note is in the bass, soprano, or an inner voice.)
But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to
parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or
not, and so forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that
appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and
others have however called such work into question.2 Other theorists have obviated voice-leading
as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation to
maintain an association between the components of a given lyne.3 For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
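
As a toy illustration of the leap rule paraphrased above (not any established counterpoint
engine; pitches are MIDI numbers and the "step" threshold of two semitones is an assumption):

    def leap_rule_violations(melody):
        # A leap (> 2 semitones) should be followed by a step in the opposite
        # direction or by another leap; report notes that break this.
        violations = []
        for i in range(len(melody) - 2):
            first = melody[i + 1] - melody[i]
            second = melody[i + 2] - melody[i + 1]
            if abs(first) > 2:
                steps_back = abs(second) <= 2 and first * second < 0
                another_leap = abs(second) > 2
                if not (steps_back or another_leap):
                    violations.append(i + 1)   # note that ends the offending leap
        return violations

    print(leap_rule_violations([60, 65, 64, 62, 67, 69]))   # [4]: leap up, then step up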

In earlier days, set theory has had an air of the secret society about it, with admission granted
only to those who possess the magic password, a forbidding technical vocabulary bristling with
expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as
the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language. Where did this
theory come from and how has it managed to become so dominant? Set theory emerged in response to
the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of
referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary
variety of musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes
and modes of progression. In tonal music, motivic relationships are constrained by the norms of
tonal syntax; in post-tonal music, motives become independent and function as primary structural
determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
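
A small sketch of that quantification under twelve-tone equal temperament with A4 = 440 Hz (a
conventional reference the passage itself does not fix): map a frequency to the nearest
equal-tempered pitch and report the residual in cents.

    import math

    A4 = 440.0
    NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def pitch_from_frequency(hz):
        # Nearest MIDI note number, then the offset from it in cents.
        midi = round(69 + 12 * math.log2(hz / A4))
        cents = 1200 * math.log2(hz / (A4 * 2 ** ((midi - 69) / 12)))
        return f"{NAMES[midi % 12]}{midi // 12 - 1} ({cents:+.0f} cents)"

    print(pitch_from_frequency(261.63))   # C4, within a cent
    print(pitch_from_frequency(452.0))    # A4, about +47 cents sharp
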
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.


Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, professor emerita of piano
at Boise State University.

Bibliography (Books only)

(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
note isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

-1178-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1179-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-1180-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are

-1181-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-1182-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from
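The leap rule stated at the start of this passage is mechanical enough to check in code. The
sketch below is a toy of my own devising, not a counterpoint engine: the interval whitelist is
an assumption, and the "permissible three-note sonority" condition on successive leaps is
deliberately left out.

# Toy check of the cantus-firmus leap rule: a leap must be followed by a step
# in the opposite direction or by another leap; leaps are whitelisted.
ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # assumed whitelist, in semitones

def check_leaps(melody):
    # melody: list of pitches as MIDI numbers. Returns rule violations.
    problems = []
    for i in range(len(melody) - 1):
        step = melody[i + 1] - melody[i]
        if abs(step) <= 2:
            continue                   # steps (<= whole tone) always allowed here
        if abs(step) not in ALLOWED_LEAPS:
            problems.append((i, f"forbidden leap of {abs(step)} semitones"))
        if i + 2 < len(melody):
            nxt = melody[i + 2] - melody[i + 1]
            step_back = abs(nxt) <= 2 and nxt * step < 0   # step, opposite way
            another_leap = abs(nxt) > 2                    # second leap
            if not (step_back or another_leap):
                problems.append((i + 1, "leap not resolved by contrary step"))
    return problems

print(check_leaps([60, 64, 62, 60]))   # leap up a third, step back down: OK
print(check_leaps([60, 67, 69]))       # leap up a fifth, then step up: flagged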

In earlier days, set theory had an air of the secret society about it, with admission granted
only to those who possess the magic password, a forbidding technical vocabulary bristling with
expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated
as the sterile application of arcane, mathematical concepts to inaudible and uninteresting
musical relationships. This situation has created understandable frustration among musicians,
and the frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar language.
Where did this theory come from and how has it managed to become so dominant? Set theory
emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses
only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
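The quantification step described above (assigning a pitch by comparison with pure tones of
known frequency) can be sketched in code. The sketch assumes twelve-tone equal temperament and
the A4 = 440 Hz convention, neither of which is claimed by the text itself: a measured
frequency maps to the nearest equal-tempered pitch, plus a deviation in cents.

# Map a measured frequency to the nearest 12-TET pitch (A4 = 440 Hz assumed).
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_pitch(f_hz, a4=440.0):
    # Returns (note name, octave, cents offset from that pitch).
    midi_exact = 69 + 12 * math.log2(f_hz / a4)   # continuous MIDI number
    midi = round(midi_exact)
    cents = 100 * (midi_exact - midi)
    return NOTE_NAMES[midi % 12], midi // 12 - 1, cents

print(freq_to_pitch(440.0))    # ('A', 4, 0.0)
print(freq_to_pitch(261.63))   # ~('C', 4, ~0 cents)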

Forte is well known for his book The Structure of Atonal Music (1973).

Personal life[edit]
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)[edit]


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also[edit]
Forte number
References
1. "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.

Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the complement of A by A'. NB: A ∩ A' = ∅ (A and A' are disjoint).

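The definitions above translate directly into code. A small sketch (mine, not the primer's), with pitches as integers and pcsets as Python frozensets:

AGGREGATE = frozenset(range(12))   # the aggregate U

def pc(pitch: int) -> int:
    """Map a pitch to its pitch class; octave-related pitches coincide."""
    return pitch % 12

def pcset(pitches) -> frozenset:
    """The unordered set of pcs represented by a pset."""
    return frozenset(pc(p) for p in pitches)

def complement(a: frozenset) -> frozenset:
    """A' = all elements of U not in A."""
    return AGGREGATE - a

pset = [60, 64, 67, 76]             # C4, E4, G4, E5 (one pset realizing a pcset)
A = pcset(pset)                     # frozenset({0, 4, 7}); E4 and E5 merge
B = pcset([67, 71, 74])             # G major triad
print(A | B, A & B, complement(A))  # union, intersection, complement
print(A & complement(A) == frozenset())  # True: A and A' are disjoint
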
In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from

...earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
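
In the equal-tempered convention that comparison reduces to a formula; a minimal sketch (mine, not the article's), assuming 12-tone equal temperament, A4 = 440 Hz, and MIDI note numbering:

import math

def freq_to_midi(f_hz: float) -> int:
    """Nearest equal-tempered note number (MIDI convention: A4 = 69)."""
    return round(69 + 12 * math.log2(f_hz / 440.0))

def midi_to_freq(n: int) -> float:
    return 440.0 * 2 ** ((n - 69) / 12)

print(freq_to_midi(261.6))  # 60 (middle C)
print(midi_to_freq(69))     # 440.0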
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.


A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on

bcdedit /dbgsettings local
(a reboot is required for the debug settings to take effect)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:

https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310
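
To avoid retyping the symbol-server string each session, a small helper can set it once and launch the tools named above (my sketch, not part of the guide; assumes psexec is on PATH, and uses _NT_SYMBOL_PATH, the standard dbghelp symbol-path environment variable):

import os
import subprocess

# Hypothetical helper: export the Microsoft public symbol server path,
# then launch Process Explorer as SYSTEM via psexec (flags as in the notes above).
SYMBOLS = r"srv*c:\symbols*http://msdl.microsoft.com/download/symbols"
PROCEXP = r"C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"

env = dict(os.environ, _NT_SYMBOL_PATH=SYMBOLS)
subprocess.run(["psexec", "-s", "-i", "-d", PROCEXP], env=env, check=False)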

[Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2>, <131 402>]

In essence, the algorithm looks at a string of intervals derived from the successive values in some musical dimension in a piece of music. The string might be a series of pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one string can be selected for analysis, in which case the algorithm combines each dimension's successive intervals according to the user-specified weighted average described above.

Example 21a shows the results using one dimension, pitch alone. The first pass segments the pitches into segments of three to six pitches; that is, the segmentation is determined by the sequence of the sizes of successive unordered pitch intervals. The segmental boundaries are shown by vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure, since they fall in a lower register than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two segments, the second into one. And in general, the boundaries of these first-level segments never contradict our more intuitively derived phrase structure. The second pass works on the averages of the values in each level-1 segment. These averages are simply the center pitch of the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting second-level segmentation divides the piece in half in the middle of the third phrase, which contradicts our six-phrase structure. That the second-pass parsing is at variance with our phrase structure is not an embarrassment, for we are taking pitch intervals as the only criterion for segmentation. Let us examine Example 21b, with the algorithm taking only time spans between notes as input. Here the unit of time is a thirty-second note. Once again the first level basically conforms to our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions the stream of durations so that it has an exception inherited from level 1; the last phrase is divided in half, with its first part serving as a conclusion to the second-level segment that starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and pitch. The initial values of the previous examples are simply added together. This time the results get
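
The passes described above are easy to prototype. A minimal sketch of the one-dimension version (my reading of the description, not Tenney and Polansky's published code; the strict "greater than both neighbors" test follows the text, while the input values are hypothetical rather than the Schoenberg piece):

def segment(values):
    """Split where an undirected interval exceeds both of its neighbors;
    the value after the peak interval starts a new segment."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    boundaries = [i + 2 for i in range(len(intervals) - 2)
                  if intervals[i + 1] > intervals[i] and intervals[i + 1] > intervals[i + 2]]
    segments, start = [], 0
    for b in boundaries:
        segments.append(values[start:b])
        start = b
    segments.append(values[start:])
    return segments

def parse(values):
    """Re-apply the algorithm to segment averages until one segment remains."""
    levels = [segment(values)]
    while len(levels[-1]) > 1:
        averages = [sum(s) / len(s) for s in levels[-1]]
        levels.append(segment(averages))
    return levels

pitches = [64, 66, 67, 65, 60, 72, 71, 74, 59, 58]  # hypothetical values
for level in parse(pitches):
    print(level)

Each pass shrinks the list of values, so the loop always terminates with the music parsed into a single segment.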


https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance
ry Primer
1
Pitch and pitch-class (pc)(1)
Pitch space
: a linear series of pitches (semitones) from low to high modeled by integers.(2) Sets of
pitches (called
psets
) are selections from the set of pitches; they are unordered in time.(3)
Pc space
: circle of pitch-classes (no lower or higher relations) modeled by integers, mod 12 (see
below).(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any
number of octaves map to the same pitch-class.(5) Sets of pcs (called
pcsets
) are selections from the set of pcs; they are unordered in time (and pitch).(6) Pcsets must be
realized (or represented or articulated) by pitches. To realize a pcset in music, it must
be ordered in pitch andin time. Every musical articulation of a pcset produces a contour. Many
different psets may represent one pcset. Pcsets may modelmelodies, harmonies, mixed textures,
etc.Definitions from Finite Set Theory(6) The set of all the pcs is called the
aggregate
and is denoted by the letter U; the set of no pcs is called the empty or
null
set, and isdenoted by the sign

-1209-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
(7) Membership: If a is a member (or element) of the set B, we write a
?
B.(8) Inclusion: If A and B are sets and A is contained in B, we write A
?
B.(9) The union of two sets A and B (written A
?
B) is the content of both of them.(10) The intersection of two sets A and B is their common
elements (written A
n
B).(11) Two sets are disjoint if their intersection is

.(12) B is the complement of A, if B contains all elements of U not in A. We show the


complement of A by A
'
.NB: A
n
A
'
=

(A and A
'
are d

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor

-1210-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.


Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
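
The "comparing sounds with pure tones" quantification has a simple closed form in equal
temperament. A small sketch (assuming the conventional A4 = 440 Hz reference; the helper
below is hypothetical, not from the quoted article):

import math

A4_HZ = 440.0   # assumed reference tuning: A4 = 440 Hz, 12-tone equal temperament
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_pitch(freq_hz):
    """Map a frequency to the nearest equal-tempered pitch and the
    deviation from it in cents."""
    midi_exact = 69.0 + 12.0 * math.log2(freq_hz / A4_HZ)
    midi = round(midi_exact)
    cents = 100.0 * (midi_exact - midi)
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
    return name, cents

print(frequency_to_pitch(261.63))   # ~C4, close to 0 cents
print(frequency_to_pitch(450.0))    # ~A4, about +39 cents sharp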
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in
a stimulus.

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure
(SSM). In this work we used three SSMs, namely the absolute value of the cross correlation
(also known as Pearson's coefficient) CC, the mutual information MI, and the mutual
information of the time series ordinal patterns MIOP [25]. The former is a linear measure
and the latter two are non-linear ones.
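
Of the three SSMs, MIOP is the least self-explanatory. My reading of it, sketched below, is:
symbolize each series by the permutation (ordinal pattern) that sorts each short window,
then take the mutual information of the two symbol streams. The embedding dimension of 3 is
an arbitrary choice here; consult the paper for the actual definition.

import numpy as np
from itertools import permutations
from sklearn.metrics import mutual_info_score

def ordinal_patterns(x, dim=3):
    """Encode each length-`dim` window of x by the permutation that
    sorts it (Bandt-Pompe symbolization)."""
    lookup = {p: i for i, p in enumerate(permutations(range(dim)))}
    return np.array([lookup[tuple(np.argsort(x[i:i + dim]))]
                     for i in range(len(x) - dim + 1)])

def miop(x, y, dim=3):
    """Rough sketch of mutual information of ordinal patterns (MIOP):
    MI between the two symbol sequences, in nats."""
    return mutual_info_score(ordinal_patterns(x, dim), ordinal_patterns(y, dim))

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
y = np.cos(t) + 0.1 * rng.standard_normal(t.size)   # nonlinearly related to x
print(miop(x, y))                                    # noticeably > 0
print(miop(x, rng.standard_normal(t.size)))          # near 0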

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero even
though the variables are still stochastically dependent. On the other hand, Covariance can
be calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution),
while Mutual Information requires knowledge of the distributions, whose estimation, if they
are unknown, is a much more delicate and uncertain task than the estimation of Covariance.
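
The practical asymmetry described above is easy to demonstrate: the covariance estimate is
one line of sample moments, while even a crude MI estimate needs a density estimate first. A
toy sketch (the 2-D histogram plug-in estimator below is a deliberately naive choice, not
from the quoted answer):

import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 50_000)
y = x ** 2 + 0.05 * rng.standard_normal(x.size)   # dependent, but uncorrelated

# Covariance: computed directly from sample moments, no densities needed.
cov = np.mean((x - x.mean()) * (y - y.mean()))

# MI: requires an estimate of the joint density; here a naive 2-D histogram.
pxy, _, _ = np.histogram2d(x, y, bins=30)
pxy = pxy / pxy.sum()
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

print(f"cov ~ {cov:.4f} (near 0)   MI ~ {mi:.3f} nats (clearly > 0)")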

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-1212-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a

-1213-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1214-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1215-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated

-1216-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)

-1217-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1218-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1219-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1220-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-1221-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information

-1222-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two

-1223-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...

-1224-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
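
The last point, that covariance is a simple moment estimate while MI estimation is delicate,
can be seen directly: a plug-in MI estimate shifts with the arbitrary choice of histogram bins,
whereas the sample covariance is a fixed formula. A minimal sketch (my addition, with
illustrative data):

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=2000)
y = 0.5 * x + rng.normal(size=2000)

cov = ((x - x.mean()) * (y - y.mean())).mean()    # directly from sample moments
print(f"cov = {cov:.3f}")

for bins in (4, 16, 64):                          # the MI estimate moves with bins
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    mi = (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()
    print(f"bins={bins:2d}: MI ~ {mi:.3f} nats")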

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading
for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have
attempted to apply linear concepts such as Schenkerian prolongation to music that appears to
have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have
however called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own theory
of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from ...
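
The cantus-firmus leap rule quoted at the start of this passage lends itself to a toy checker.
The sketch below is my illustration, not the author's method: pitches are MIDI numbers, a
"step" is at most two semitones, and the permissible-sonority proviso on successive leaps is
deliberately omitted:

def leap_violations(melody, max_step=2):
    # Return indices of notes reached by a leap that is not followed by a
    # contrary step or by another leap (the quoted rule, minus the
    # three-note-sonority proviso).
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) > max_step:                       # note i+1 was reached by leap
            recovers = abs(second) <= max_step and second * first < 0
            another_leap = abs(second) > max_step
            if not (recovers or another_leap):
                bad.append(i + 1)
    return bad

cantus = [62, 65, 64, 62, 67, 65, 64, 62]   # D F E D G F E D (MIDI)
print(leap_violations(cantus))               # [] -> every leap recovers by step
print(leap_violations([60, 64, 65, 64]))     # [1] -> leap then step in same direction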

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant?

Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal
music uses only a small number of referential sonorities (triads and seventh chords);
post-tonal music presents an extraordinary variety of musical configurations. Tonal music
shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1231-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1232-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-1233-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1234-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1235-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-1236-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions

-1237-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-1238-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from ...
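
The leap rule above lends itself to a tiny mechanical check. A minimal sketch (my own
illustration, not the author's system): pitches are MIDI note numbers, anything larger than a
major third counts as a leap, and only the step-back resolution is tested (the permitted
second-leap continuation is ignored).

    # Check that every leap is answered by a step in the opposite direction
    def leaps_resolved(melody):
        for prev, cur, nxt in zip(melody, melody[1:], melody[2:]):
            leap = cur - prev
            if abs(leap) > 4:                    # wider than a major third
                step = nxt - cur
                if not (1 <= abs(step) <= 2 and step * leap < 0):
                    return False                 # leap left unresolved
        return True

    print(leaps_resolved([60, 67, 65, 64]))      # leap of a fifth, step back: True
    print(leaps_resolved([60, 67, 69, 67]))      # leap then step onward: False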

... earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional ...
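
Those "forbidding" expressions are mechanical to compute. A short sketch: the interval vector
(interval-class vector) of a pitch-class set counts, for each interval class 1 through 6, the
pairs of pcs that form it; "6-Z44" names the hexachord whose prime form is (0,1,2,5,6,9).

    # Interval-class vector of a pitch-class set
    from itertools import combinations

    def interval_vector(pcs):
        vec = [0] * 6
        for a, b in combinations(pcs, 2):
            ic = min((a - b) % 12, (b - a) % 12)  # interval class 1..6
            vec[ic - 1] += 1
        return vec

    print(interval_vector([0, 1, 2, 5, 6, 9]))    # 6-Z44 -> [3, 1, 3, 4, 3, 1]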

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
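
Temporal theories rest on periodicity, and the frequency-matching quantification described
above can be mimicked computationally. A minimal sketch (sample rate and search range are
assumptions for the demo): estimate a pitch by locating the dominant autocorrelation peak of
the waveform.

    # Periodicity-based pitch estimate via the autocorrelation peak
    import numpy as np

    def estimate_pitch(signal, sr, fmin=50.0, fmax=1000.0):
        ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
        lo, hi = int(sr / fmax), int(sr / fmin)
        lag = lo + int(np.argmax(ac[lo:hi]))     # best lag in the search band
        return sr / lag

    sr = 44100
    t = np.arange(4096) / sr
    print(estimate_pitch(np.sin(2 * np.pi * 220.0 * t), sr))  # ~220 Hz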

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-1254-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-1255-
C:\Users\dan\Desktop\junk_scribd.txt Monday, April 17, 2017 7:32 PM
Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in
the professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal
music, motives become independent and function as primary structural determinants. In this
situation, a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
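The point above that oscillations "can be measured to obtain a frequency" is easy to
demonstrate. A sketch (the 440 Hz test tone and the autocorrelation method are my own choices,
not from the excerpt) that assigns a frequency to a complex but periodic waveform:

import numpy as np

fs = 44_100                                   # sample rate in Hz
t = np.arange(0, 0.5, 1 / fs)
# A complex (non-sinusoidal) periodic tone: 440 Hz fundamental plus two harmonics.
wave = sum(np.sin(2 * np.pi * 440 * k * t) / k for k in (1, 2, 3))

ac = np.correlate(wave, wave, mode="full")[wave.size - 1:]
first_dip = np.argmax(ac < 0)                 # move past the zero-lag peak
period = np.argmax(ac[first_dip:]) + first_dip
print(fs / period)                            # ~440 Hz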

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers,
mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from finite set theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement
of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
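Since pcsets and the aggregate are just finite sets of residues mod 12, the definitions above
translate almost one-for-one into code. A small sketch; encoding pcs as the integers 0-11 is
the only assumption added here.

U = frozenset(range(12))                  # the aggregate (def. 6)

def pc(pitch):
    # Map a pitch in semitones (any octave) to its pitch-class (def. 4).
    return pitch % 12

pset = [60, 64, 67, 76]                   # a pset, e.g. MIDI note numbers
pcset = frozenset(pc(p) for p in pset)    # {0, 4, 7}: octave duplicates merge

A, B = frozenset({0, 4, 7}), frozenset({7, 11, 2})
print(A | B)                              # union (def. 9)
print(A & B)                              # intersection (def. 10)
print(A.isdisjoint(B))                    # disjointness (def. 11): False, share 7
print(U - A)                              # complement A' in the aggregate (def. 12)
assert A & (U - A) == frozenset()         # A ∩ A′ = ∅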
