
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM


(relative/parallel) That's called modulatory space in terms of tonal centers (a
plane instead of a line). It also works for individual tones.

This is all very related to music psychology and pitch perception, which has a linear and
a non-linear component (phase-locking).

One implication of this is that enharmonics matter: in this approach the enharmonics are not
equivalent. This might not be so intuitive, but octave equivalency plays a big part in
obscuring how it works. Octaves are structure, like the floor and ceiling, except the
floor and ceiling are the same thing, so you "wrap around" in terms of where the (dark/light)
tritones land in linear space.

Lydian and Locrian both contain a tritone, but the tritone is not symmetrical (in terms of
tonal perception): enharmonics matter. In Lydian it is the "tritone above"; in Locrian it is
the "tritone below." These get mapped to the same place in linear pitch space (due to octave
equivalency), but they are not the same tritone. This can be seen on the circle of fifths:
F# vs. Gb. One is the leading tone to G, the other is the 4th of Db. These are two sides of a
tritone, completed by C.
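
A minimal Python sketch of that distinction (the note spellings and the line-of-fifths layout are the only assumptions): F# and Gb sit twelve fifths apart on the line of fifths, even though they collapse to the same mod-12 pitch class.

# line of fifths with C at the origin; flats run left, sharps run right
LINE_OF_FIFTHS = ["Fb", "Cb", "Gb", "Db", "Ab", "Eb", "Bb",
                  "F", "C", "G", "D", "A", "E", "B",
                  "F#", "C#", "G#", "D#", "A#", "E#", "B#"]

def fifths_position(name):
    # signed distance from C in perfect fifths
    return LINE_OF_FIFTHS.index(name) - LINE_OF_FIFTHS.index("C")

# mod-12 pitch classes, where the enharmonic distinction disappears
PC = {"C": 0, "Db": 1, "D": 2, "Eb": 3, "E": 4, "F": 5,
      "F#": 6, "Gb": 6, "G": 7, "Ab": 8, "A": 9, "Bb": 10, "B": 11}

for note in ("F#", "Gb"):
    print(note, fifths_position(note), PC[note])
# F#  +6 fifths, pc 6
# Gb  -6 fifths, pc 6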

Those aren't the only tritones, however. There is also, for example, the blues b5, which in
some theories comes from a different type of harmony (7-limit).
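
To put numbers on that (a sketch; the ratios 45/32, 64/45, and 7/5 are the just-intonation candidates usually cited, not anything claimed above): the 7-limit b5 is audibly flatter than either 5-limit spelling or the equal-tempered tritone.

import math

def cents(ratio):
    # 1200 * log2(ratio) converts a frequency ratio to cents
    return 1200 * math.log2(ratio)

for name, ratio in [("augmented 4th 45/32", 45 / 32),
                    ("diminished 5th 64/45", 64 / 45),
                    ("7-limit b5 7/5", 7 / 5),
                    ("equal-tempered tritone", 2 ** 0.5)]:
    print(f"{name}: {cents(ratio):.1f} cents")
# augmented 4th 45/32: 590.2 cents
# diminished 5th 64/45: 609.8 cents
# 7-limit b5 7/5: 582.5 cents
# equal-tempered tritone: 600.0 cents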

Also, there are about three #11s that make sense in a tonal center. When you're dealing with
fifth-type movement (in the sharp direction of the circle), it might highlight a specific one
of those.

In terms of this "super Lydian" scale: others have found it too. Some ragas use both b2 and #1.

some call th

Einojuhani Rautavaara was the leading Finnish composer of his generation:
* His late style combined modernism with mystical romanticism
* Series of orchestral works inspired by metaphysical and religious subjects
* Immensely popular recordings on the Ondine label, including the best-selling Symphony No. 7 (Angel of Light) (1995)
* Operas on creative and historic themes including Vincent (1986-87) and Rasputin (2001-03)
* Widely performed choral works including Vigilia (1971-72, rev. 1996)
* Works written for leading orchestras on both sides of the Atlantic

Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In general,
pitch perception theories can be divided into place coding and temporal coding. Place theory
holds that the perception of pitch is determined by the place of maximum excitation on the
basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
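
A toy illustration of the temporal view (a sketch only; it is not a model of auditory phase-locking): an autocorrelation pitch estimator picks the lag at which the waveform best matches itself.

import numpy as np

fs = 8000                                    # sample rate in Hz
t = np.arange(fs) / fs                       # one second of signal
x = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)

# autocorrelation; the strongest peak away from lag 0 marks the period
ac = np.correlate(x, x, mode="full")[x.size - 1:]
lo, hi = fs // 500, fs // 50                 # search the 50-500 Hz range
period = lo + np.argmax(ac[lo:hi])
print(fs / period)                           # ~220 Hz, the shared fundamental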

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL divergence between the joint density and the product of the
individual densities. So MI can measure non-monotonic relationships and other more
complicated relationships.
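
A small demonstration of the difference (a sketch; the 2-D histogram plug-in estimator is just one crude way to approximate MI): a noisy parabola has near-zero correlation but clearly positive mutual information.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2 + 0.1 * rng.normal(size=x.size)   # non-monotonic dependence

print(np.corrcoef(x, y)[0, 1])               # ~0: no linear relationship

# plug-in MI estimate (in nats) from a 2-D histogram
pxy, _, _ = np.histogram2d(x, y, bins=30)
pxy /= pxy.sum()
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
nz = pxy > 0
print(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))   # clearly > 0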

https://www.nature.com/articles/srep10829
The construction of th

[Example 18. Primes enclosed in rectangles: repeated contour labels <0 2 1>, <1 0 3 2>, <131 402>]
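
Those labels read as contour segments: each pitch is replaced by its rank within the segment, 0 = lowest. A minimal sketch of the computation (the example pitches are arbitrary MIDI numbers):

def contour(pitches):
    # rank each pitch within the segment; <0 2 1> means low, high, middle
    order = sorted(set(pitches))
    return tuple(order.index(p) for p in pitches)

print(contour([60, 67, 64]))       # (0, 2, 1)
print(contour([62, 60, 67, 64]))   # (1, 0, 3, 2)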

read the 00 Kernel Debug Guide


https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(reboot required for the change to take effect)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program Files\Microsoft\Debugging Tools
for Windows, you can run LiveKD from any directory; otherwise you should copy LiveKD to the
directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue: symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/

-8-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve addresses
for thread start addresses in the threads tab of the process properties dialog and the thread
stack window then configure symbols by first downloading the Debugging Tools for Windows
package from Microsoft's web site and installing it in its default directory. Open the
Configure Symbols dialog and specify the path to the dbghelp.dll that's in the Debugging Tools
directory and have the symbol engine download symbols on demand from Microsoft to a directory
on your disk by entering a symbol server string for the symbol path. For example, to have
symbols download to the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02 LiveKD\ProcessExplorer\procexp64.exe"
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

In essence, the algorithm looks at a string of intervals derived from the successive values in
some musical dimension in a piece of music. The string might be a series of pitch intervals,
time intervals (delays), dynamic changes, and so forth. More than one string can be selected
for analysis. Then the algorithm combines the values of each dimension's successive intervals
according to a user-specified average which assigns a relative "weight" to each of the
dimensions. Example 20 illustrates the principle of the Tenney/Polansky algorithm: for four
successive dimension values labeled A through D forming three successive unordered intervals
labeled X, Y, and Z, if the middle interval is greater than the other two intervals, the string
of values is segmented in half; the value C starts a new segment or phrase. In its simplest
form, using only one musical dimension, the algorithm works by going through the dimension's
list of undirected intervals in threes looking for maximum values and segmenting accordingly.
This results in a series of successive segments (or phrases). We can then average the values in
each of the output segments to get a series of new higher-order values. We input these into the
algorithm to produce a second-order segmentation, and so forth, until the music is parsed into
a single segment.

To illustrate the Tenney/Polansky algorithm, we perform it on the Schoenberg piece. Example 21a
shows the results using one dimension, pitch alone. The first pass segments the pitches into
segments of three to six pitches; that is, the segmentation is determined by the sequence of
the sizes of successive unordered pitch intervals. The segmental boundaries are shown by
vertical lines. The results are quite reasonable. For instance, the four pitches <E, F#, G, F>
in phrase 2 are segmented out of the rest of the measure since they fall in a lower register
than the others. Phrases 4 and 5 seem segmented correctly; the first is divided into two
segments, the second into one. And in general, the boundaries of these first-level segments
never contradict our more intuitively derived phrase structure. The second pass works on the
averages of the values in each level-1 segment. These averages are simply the center pitch of
the bandwidth (measured in semitones) of each level-1 segment. The intervals between the series
of bandwidths forming level 2 are the input to the second pass of the algorithm. The resulting
second-level segmentation divides the piece in half in the middle of the third phrase, which
contradicts our six-phrase structure. That the second-pass parsing is at variance with our
phrase structure is not an embarrassment, for we are taking pitch intervals as the only
criterion for segmentation.

Let us examine Example 21b, with the algorithm taking only time spans between notes as input.
Here the unit of time is a thirty-second note. Once again the first level basically conforms to
our ideas of the phrase structure, with two exceptions. Likewise, the second pass partitions
the stream of durations so that it has an exception inherited from level 1; the last phrase is
divided in half, with its first part serving as a conclusion to the second-level segment that
starts at phrase 4. Finally, Example 21c shows the algorithm's output using both duration and
pitch. The initial values of the previous examples are simply added together. This time the
results get
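
A minimal sketch of the procedure in Python, for a single dimension (the strict local-maximum test and the plain arithmetic mean are assumptions where the text leaves the comparison and the averaging user-specified):

def segment(values):
    # Break before the value whose preceding interval is a local maximum:
    # for values A B C D with intervals X Y Z, if Y > X and Y > Z, C starts a new segment.
    ivs = [abs(b - a) for a, b in zip(values, values[1:])]
    bounds = [0]
    for i in range(1, len(ivs) - 1):
        if ivs[i] > ivs[i - 1] and ivs[i] > ivs[i + 1]:
            bounds.append(i + 1)
    bounds.append(len(values))
    return [values[a:b] for a, b in zip(bounds, bounds[1:])]

def parse(values):
    # Re-run the segmentation on segment averages until a single segment remains.
    levels = []
    while True:
        segs = segment(values)
        levels.append(segs)
        if len(segs) <= 1:
            return levels
        values = [sum(s) / len(s) for s in segs]

Running parse on a list of pitches (or of inter-onset durations, or of weighted sums of both) yields the first-level phrases and then the higher-order groupings, in the spirit of Examples 21a-c.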

Allen Forte, selected publications:
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain task compared to the estimation of Covariance.
ry Primer
Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
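
The same definitions in executable form (a minimal sketch; the integer pitch encoding with middle C = 60 is an assumption):

U = set(range(12))                        # the aggregate

def pcset(pitches):
    # pitches related by any number of octaves map to the same pc (mod 12)
    return {p % 12 for p in pitches}

A = pcset([60, 64, 67])                   # a C major triad -> {0, 4, 7}
B = pcset([67, 71, 74])                   # a G major triad -> {2, 7, 11}

print(sorted(A | B))                      # union
print(sorted(A & B))                      # intersection: [7]
print(sorted(U - A))                      # complement A'
print(A.isdisjoint(U - A))                # A and A' are disjoint: True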

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.[1] Joseph N. Straus and others have, however, called such work into question.[2] Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.[3] For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
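
The cantus firmus leap rule at the start of this passage is concrete enough to check mechanically. The Python sketch below encodes a simplified version of it (leaps and steps measured in semitones, with thresholds chosen by me for illustration; the alternative of a second leap outlining a permissible sonority is ignored):

# Toy check of one rule: after a leap (here, motion larger than 2 semitones),
# the melody should step (1-2 semitones) in the opposite direction.
def unresolved_leaps(melody):
    """Return indices of notes reached by a leap that is not answered
    by a step in the opposite direction."""
    problems = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]        # motion into the note
        second = melody[i + 2] - melody[i + 1]   # motion out of the note
        if abs(first) > 2:                       # a leap...
            contrary_step = 1 <= abs(second) <= 2 and first * second < 0
            if not contrary_step:                # ...not resolved by a contrary step
                problems.append(i + 1)
    return problems

melody = [60, 65, 67, 62, 64]                    # C4 F4 G4 D4 E4 in MIDI numbers
print(unresolved_leaps(melody))                  # [1]: after C4-F4 the line keeps rising

A full checker would also need the consonance, preparation, and resolution tests the passage describes for multi-voice contexts.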

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
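
Temporal coding suggests a simple computational analogue: estimate a tone's pitch from the lag of the strongest peak in its autocorrelation, i.e. from the waveform's own periodicity rather than from a place on a frequency map. The NumPy sketch below illustrates that analogy only; the sample rate, test signal, and lag cutoff are arbitrary choices, and this is not a model of the auditory nerve.

import numpy as np

fs = 44100                                          # sample rate (Hz), an arbitrary choice
t = np.arange(0, 0.05, 1.0 / fs)                    # 50 ms of signal
x = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)

ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation at lags >= 0
min_lag = fs // 1000                                # ignore lags shorter than 1 ms
peak = min_lag + int(np.argmax(ac[min_lag:]))       # strongest periodicity
print(fs / peak)                                    # about 220 Hz, the shared fundamental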

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships.
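
A small numerical illustration of the quoted contrast, using NumPy. Here y is a deterministic but non-monotonic function of x, so Pearson correlation comes out near zero while mutual information is clearly positive. The histogram-based MI estimator is a crude stand-in written for this example, not a recommended estimator.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10000)
y = x ** 2                                   # fully dependent on x, but not monotonic

print(np.corrcoef(x, y)[0, 1])               # Pearson correlation: close to 0

def mi_binned(a, b, bins=20):
    """Crude plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()                    # joint density estimate
    px = pxy.sum(axis=1, keepdims=True)      # marginal of a
    py = pxy.sum(axis=0, keepdims=True)      # marginal of b
    nz = pxy > 0                             # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

print(mi_binned(x, y))                       # clearly positive: x determines y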

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
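
The ordinal-pattern step behind a measure like MIOP reduces each short window of a series to the permutation that sorts it (Bandt-Pompe symbolization); MI is then taken between the resulting symbol sequences. The sketch below shows only the symbolization; the pattern length and the names are my choices, and this is not the paper's code.

import numpy as np
from itertools import permutations

def ordinal_patterns(series, order=3):
    """Map each length-`order` window to the index of the permutation
    that sorts it (Bandt-Pompe symbolization)."""
    lookup = {p: i for i, p in enumerate(permutations(range(order)))}
    return np.array([lookup[tuple(np.argsort(series[i:i + order]))]
                     for i in range(len(series) - order + 1)])

x = np.sin(np.linspace(0, 8 * np.pi, 200))
print(ordinal_patterns(x)[:10])    # symbol sequence; an MI estimate between two
                                   # such sequences gives an MIOP-style similarity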

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" with whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
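
That practical asymmetry is easy to demonstrate: the sample covariance is a fixed function of the data's moments, while a plug-in MI estimate inherits every choice made in approximating the unknown densities. In the sketch below the bin count alone visibly moves the MI figure while the covariance stays put (mi_binned is the same crude estimator as in the earlier snippet, repeated so this one runs on its own).

import numpy as np

def mi_binned(a, b, bins):
    """Crude plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 0.5 * x + rng.normal(scale=0.5, size=500)

print(np.cov(x, y)[0, 1])          # one number, straight from sample moments
for bins in (5, 20, 80):           # the MI estimate shifts with the bin choice
    print(bins, mi_binned(x, y, bins))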

Google search: sheet Einojuhani Rautavaara - Etudes

Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room, Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM. Any composer named Einojuhani deserves a
separate thread . . . . Logged ... His Etudes and Icons are also amazing, and his Piano Sonatas
1 and 2 are wonderful. Narcissus is also ... Anyone know where I could get some of his piano
sheet music? Logged ...

Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 - July 27, 2016). Pianist: Laura ... Download
youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin plays Gershwin -
Songbook (18 Songs) Audio + Sheet Music.

99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for cadenzas
... I have most savored by pianist Mitsuko Uchida features the etudes by Claude Debussy. ...
The Helsinki Philharmonic and Einojuhani Rautavaara

Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ... Hamelin plays
Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.

Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details ... Piano solo [Sheet music]
ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.

Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music ...
Einojuhani Rautavaara - Etudes (1969) ...

Einojuhani Rautavaara - Etudes (1969) | phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Composer: Einojuhani Rautavaara (October 9, 1928 - July 27, 2016). Pianist: Laura Mikkola.
00:03 Etude I - Thirds; 03:21 Etude II - Sevenths; 04:26 Etude III ...

[PDF] Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jose Mosalini ... Three Etudes (2000). (Olczak). Younghi
Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).

rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ... 2 The Fire
Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...

John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent to the
vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
9
10
11
Next
Screen reader users, click here to turn off Google Instant...
Google

sheet Einojuhani Rautavaara - Etudes

-30-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
AllVideosImagesShoppingMapsMore
SettingsTools
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - ?7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani deserves a
separate thread . . . . Logged ... His Etudes and Icons are also amazing, and his Piano Sonatas
1 and 2 are wonderful. Narcissus is also ... Anyone know where I could get some of his piano
sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016) Pianist: Laura ... Download
youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin plays Gershwin -
Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for cadenzas
... I have most savored by pianist Mutsuko Uchida features the etudes by Claude Debussy. ...
The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ... Hamelin plays
Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet music]
ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28 Jun 1220:02
... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016)Pianist: Laura Mikkola00:03
Etude I - Thirds03:21 Etude II - Sevenths04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jos Mosalini ... Three Etudes (2000). (Olczak). Younghi
Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ... 2 The Fire
Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent to the
vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
9
10
11
Next

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
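
To make the leap rule concrete, here is a minimal Python sketch of just that one melodic check. It assumes pitches are given as MIDI note numbers, treats anything wider than a major second as a leap, and uses a deliberately crude stand-in (a stacked fifth or octave) for the "permissible three-note sonorities" test; check_leaps and STEP_MAX are names made up for this illustration, not taken from any treatise.

STEP_MAX = 2  # semitones: a minor or major second counts as a "step"

def check_leaps(melody):
    # Flag any leap not followed by a step in the opposite direction,
    # or by a second leap in the same direction stacking up to a fifth
    # or an octave (a crude stand-in for the three-note-sonority rule).
    problems = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) <= STEP_MAX:
            continue                      # not a leap, nothing to check
        step_back = abs(second) <= STEP_MAX and first * second < 0
        stacked = (abs(second) > STEP_MAX and first * second > 0
                   and abs(first + second) in (7, 12))
        if not (step_back or stacked):
            problems.append((i, melody[i:i + 3]))
    return problems

print(check_leaps([60, 65, 64, 62, 60]))  # leap up a fourth, then steps back: []
print(check_leaps([60, 67, 69]))          # leap up a fifth, then a step up: flagged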

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from, and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be assigned a pitch by this method.
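
As a small sketch of that quantification step, the snippet below snaps a measured frequency to the nearest note of 12-tone equal temperament. The A4 = 440 Hz reference and the MIDI numbering convention (A4 = 69) are assumptions of the sketch, not facts about pitch perception itself.

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz, a4=440.0):
    # Return (note name, octave, cents offset) for a frequency in hertz.
    semitones = 12 * math.log2(freq_hz / a4)   # signed distance from A4
    nearest = round(semitones)
    cents = 100 * (semitones - nearest)        # deviation from that note
    midi = 69 + nearest                        # MIDI convention: A4 = 69
    return NOTE_NAMES[midi % 12], midi // 12 - 1, cents

print(freq_to_note(440.0))    # ('A', 4, 0.0)
print(freq_to_note(261.63))   # middle C, within a fraction of a cent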

...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
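
A toy illustration of the temporal-coding intuition (that periodicity in the signal, rather than place of excitation, can determine pitch) is an autocorrelation f0 estimator: pick the lag at which the waveform best matches itself. This is a standard signal-processing sketch, not a model of auditory-nerve phase-locking.

import numpy as np

def estimate_f0(signal, sample_rate, fmin=50.0, fmax=1000.0):
    # Estimate the fundamental as the inverse of the best autocorrelation lag.
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]  # lags >= 0
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    lag = lo + np.argmax(ac[lo:hi])       # strongest self-similarity in range
    return sample_rate / lag

sr = 8000
t = np.arange(int(0.5 * sr)) / sr
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(estimate_f0(tone, sr))              # ~220 Hz despite the added partial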

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other, more complicated relationships.
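
A quick numerical sketch of the difference: when y depends on x non-monotonically (here y = x^2 plus noise), the Pearson coefficient comes out near zero while a binned mutual-information estimate is clearly positive. The histogram plug-in estimator, bin count, and sample size below are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x**2 + 0.05 * rng.normal(size=x.size)   # strongly dependent, but not linear

def mutual_information(a, b, bins=32):
    # Plug-in MI estimate (in nats) from a 2-D histogram of the samples.
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0                               # avoid log(0) terms
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

print(np.corrcoef(x, y)[0, 1])   # ~0: the linear measure misses the dependence
print(mutual_information(x, y))  # well above 0: MI detects it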

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series' ordinal patterns MIOP [25]. The first is a linear measure and the latter two are non-linear ones.
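
The ordinal-pattern variant can be sketched roughly as follows: each overlapping length-m window of a series is replaced by the permutation that sorts it, and MI is then computed between the two resulting symbol streams. The window length m = 3 and the plug-in estimator below are assumptions made for illustration; the paper's exact MIOP procedure may differ in detail.

import numpy as np
from collections import Counter
from math import log

def ordinal_patterns(series, m=3):
    # Replace each length-m window by the permutation that sorts it.
    return [tuple(np.argsort(series[i:i + m])) for i in range(len(series) - m + 1)]

def symbol_mi(a, b):
    # Plug-in MI (in nats) between two equal-length symbol sequences.
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum(c / n * log((c / n) / ((pa[s] / n) * (pb[t] / n)))
               for (s, t), c in pab.items())

ts = np.linspace(0, 20 * np.pi, 2000)
u, v = np.sin(ts), np.sin(ts + 0.3)   # two coupled "oscillators"
print(symbol_mi(ordinal_patterns(u), ordinal_patterns(v)))  # > 0 when coupled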

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that mutual information "is not concerned" with whether the association is linear or not, while covariance may be zero and the variables may still be stochastically dependent. On the other hand, covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while mutual information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain task compared to the estimation of covariance.

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-33-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-34-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-35-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-36-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while

-37-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created

-38-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-39-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-40-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-41-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation

-42-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have however called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
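
The leap rule quoted at the start of this passage is mechanical enough to check programmatically.
A minimal sketch under assumptions not in the source: pitches are MIDI note numbers, any interval
larger than a major second (2 semitones) counts as a leap, and the "permissible three-note
sonorities" check is reduced to stacked-thirds (triadic) outlines:

LEAP = 2  # anything larger than a major second counts as a leap

def is_leap(a, b):
    return abs(b - a) > LEAP

def outlines_triad(a, b, c):
    # Two successive leaps must outline a permissible three-note
    # sonority; simplified here to a stack of minor/major thirds.
    i1, i2 = abs(b - a), abs(c - b)
    return {i1, i2} <= {3, 4}

def check_leaps(melody):
    """Yield (index, message) for each violation of the leap rule."""
    for i in range(len(melody) - 2):
        a, b, c = melody[i], melody[i + 1], melody[i + 2]
        if not is_leap(a, b):
            continue
        step_back = (not is_leap(b, c)) and (c - b) * (b - a) < 0
        if step_back:
            continue  # leap recovered by a step in the opposite direction
        if is_leap(b, c) and outlines_triad(a, b, c):
            continue  # second leap, but the three notes outline a triad
        yield i, f"leap at note {i} not properly resolved"

# Example: D4 F4 E4 D4 C4 - a third upward, then a step back down: no violations.
print(list(check_leaps([62, 65, 64, 62, 60])))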

Like Schenkerian analysis in its earlier days, set theory has had an air of the secret society
about it, with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus
often appeared to the uninitiated as the sterile application of arcane, mathematical concepts
to inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of twentieth-century
music in the professional theoretical literature have come to be expressed almost entirely in
this unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
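
The temporal-coding idea above (spikes phase-locked to the stimulus period) has a direct
signal-processing analogue: assign a pitch by finding the lag at which a waveform best
correlates with itself. A minimal sketch, not from the source; the 440 Hz test tone, the noise
level, and the 50-1000 Hz search band are arbitrary choices of mine:

import numpy as np

def estimate_pitch(signal, sr, fmin=50.0, fmax=1000.0):
    """Crude autocorrelation pitch estimate (temporal-coding analogue).

    Returns the frequency whose period gives the strongest
    self-similarity within [fmin, fmax].
    """
    signal = signal - signal.mean()
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(ac[lo:hi + 1])
    return sr / lag

# Example: a 440 Hz tone with a bit of noise is still assigned ~440 Hz.
sr = 44100
t = np.arange(int(0.05 * sr)) / sr
tone = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
print(round(estimate_pitch(tone, sr), 1))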

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-46-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

-47-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.

-48-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-49-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known

-50-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for

-51-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

-52-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-53-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

-54-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-55-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-56-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-57-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
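
The cantus-firmus leap rule quoted above is concrete enough to check mechanically. An illustrative sketch, under simplifying assumptions not in the original (pitches as integers in semitones, a "step" taken as one or two semitones, and the "permissible three-note sonorities" escape clause for successive leaps omitted):

def leaps_resolved(melody):
    # melody: a list of pitches in semitones, e.g. MIDI note numbers.
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        leap = b - a
        if abs(leap) > 2:                        # larger than a whole step: a leap
            after = c - b
            step_back = 1 <= abs(after) <= 2 and (after > 0) != (leap > 0)
            if not step_back:
                return False                     # leap not answered by a contrary step
    return True

print(leaps_resolved([60, 65, 64, 62]))  # leap up a 4th, then steps down: True
print(leaps_resolved([60, 65, 67]))      # leap up, then a step in the same direction: False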

In earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
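
A small sketch of that quantification, assuming twelve-tone equal temperament with A4 = 440 Hz (both are conventions, not claims from the article): a frequency is mapped to the nearest pitch, with the residual reported in semitones.

import math

def freq_to_pitch(f_hz, a4=440.0):
    # Semitones above/below A4 (MIDI number 69) in equal temperament.
    midi = 69 + 12 * math.log2(f_hz / a4)
    nearest = round(midi)
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    name = names[nearest % 12] + str(nearest // 12 - 1)
    return name, midi - nearest              # note name, error in semitones

print(freq_to_pitch(261.63))   # ('C4', ~0.0)
print(freq_to_pitch(450.0))    # ('A4', ~+0.39): 450 Hz is heard as a sharp A4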



Einojuhani Rautavaara was the leading Finnish composer of his generation
* His late style combined modernism with mystical romanticism
* Series of orchestral works inspired by metaphysical and religious subjects
* Immensely popular recordings on Ondine label, including best-selling Symphony No. 7 (Angel of Light) (1995)
* Operas on creative and historic themes including Vincent (1986-87) and Rasputin (2001-03)
* Widely performed choral works including Vigilia (1971-72, rev. 1996)
* Works written for leading orchestras on both sides of the Atlantic


Google search: sheet Einojuhani Rautavaara - Etudes

Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani deserves a
separate thread . . . . Logged ... His Etudes and Icons are also amazing, and his Piano Sonatas
1 and 2 are wonderful. Narcissus is also ... Anyone know where I could get some of his piano
sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)

igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 - July 27, 2016) Pianist: Laura ... Download
youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin plays Gershwin -
Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for cadenzas
... I have most savored by pianist Mutsuko Uchida features the etudes by Claude Debussy. ...
The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ... Hamelin plays
Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet music]
ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28 Jun 1220:02
... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 - July 27, 2016). Pianist: Laura Mikkola.
00:03 Etude I - Thirds / 03:21 Etude II - Sevenths / 04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-José Mosalini ... Three Etudes (2000). (Olczak). Younghi
Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ... 2 The Fire
Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent to the
vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented, or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory

(7) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(8) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(9) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(10) The union of two sets A and B (written A ∪ B) is the content of both of them.
(11) The intersection of two sets A and B is their common elements (written A ∩ B).
(12) Two sets are disjoint if their intersection is ∅.
(13) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
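
A small sketch of these definitions in Python (the particular pset and pcsets are arbitrary examples):

AGGREGATE = frozenset(range(12))             # U, the set of all twelve pcs

def pc(pitch: int) -> int:
    # (4): pitches related by any number of octaves map to the same pc.
    return pitch % 12

pset = {60, 64, 67, 72}                      # a pset: C4, E4, G4, C5
pcset = {pc(p) for p in pset}                # the pcset it represents: {0, 4, 7}

A, B = {0, 4, 7}, {2, 7, 11}
print(A | B)                                 # union, A ∪ B
print(A & B)                                 # intersection, A ∩ B = {7}: not disjoint
print(AGGREGATE - A)                         # complement A′ relative to U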

nsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof a


cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed
eitherbyastepintheoppos//stats.stackexchange.com/questions/81659/mutual-information-versus-correl
ation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

itedirectionorbyanotherleap,providedthe two successiveleapsoutline oneof


afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

In earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language.

Where did this theory come from and how has it managed to become so dominant? Set theory
emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses
only a small number of referential sonorities (triads and seventh chords); post-tonal music
presents an extraordinary variety of musical configurations. Tonal music shares a common
practice of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional ...
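
Of the two "password" terms, the interval vector at least is easy to demystify: it is a six-entry tally of the interval classes between all pairs of pitch classes in a set. A short sketch:

from itertools import combinations

def interval_vector(pcs):
    """Interval-class vector of a pitch-class set.

    For every unordered pair, count the interval class
    (the smaller of d and 12 - d), giving a 6-entry tally.
    """
    vec = [0] * 6
    for a, b in combinations(set(pcs), 2):
        d = abs(a - b) % 12
        ic = min(d, 12 - d)
        vec[ic - 1] += 1
    return vec

# C major triad {0, 4, 7}: one ic3, one ic4, one ic5.
print(interval_vector([0, 4, 7]))        # -> [0, 0, 1, 1, 1, 0]
# The hexachord 6-Z44, in one common spelling.
print(interval_vector([0, 1, 2, 5, 6, 9]))  # -> [3, 1, 3, 4, 3, 1]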

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
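
A toy illustration of the temporal-coding idea (not a model of the auditory system): the periodicity that phase-locked spikes would track can be read off a signal's autocorrelation, and the resulting frequency can then be quantified as a nearest note, as in the pitch-to-frequency mapping described above. Assumes NumPy; the parameter choices are arbitrary.

import numpy as np

def autocorr_pitch(signal, sr, fmin=50.0, fmax=1000.0):
    """Estimate a fundamental frequency from the autocorrelation peak."""
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))  # strongest self-similarity lag in range
    return sr / lag

sr = 44100
t = np.arange(sr) / sr
f0 = autocorr_pitch(np.sin(2 * np.pi * 220.0 * t), sr)

# Quantify the percept: nearest equal-tempered MIDI note (A440 reference).
midi = round(69 + 12 * np.log2(f0 / 440.0))
print(f"{f0:.1f} Hz -> MIDI {midi}")  # roughly 220 Hz -> 57 (A3)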

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns MIOP [25]. The former is a linear measure and the latter two are
non-linear ones.
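
A rough sketch of two of those SSMs under stated simplifications: the absolute cross correlation at zero lag, and a plug-in (histogram) mutual information between ordinal-pattern sequences. The paper's actual MIOP estimator and parameters may differ; this assumes NumPy >= 1.20 for sliding_window_view.

import numpy as np

def abs_cc(x, y):
    """|Pearson cross-correlation| at zero lag."""
    return abs(np.corrcoef(x, y)[0, 1])

def ordinal_patterns(x, m=3):
    """Map each length-m window to an integer code for its rank pattern."""
    windows = np.lib.stride_tricks.sliding_window_view(x, m)
    ranks = np.argsort(np.argsort(windows, axis=1), axis=1)
    return (ranks * (m ** np.arange(m))).sum(axis=1)  # one code per window

def mi_discrete(a, b):
    """Plug-in mutual information (nats) between two symbol sequences."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1.0)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

# Two noisy, phase-shifted oscillators as stand-ins for network nodes.
rng = np.random.default_rng(1)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
y = np.sin(t + 0.5) + 0.1 * rng.standard_normal(t.size)
print("CC:  ", abs_cc(x, y))
print("MIOP:", mi_discrete(ordinal_patterns(x), ordinal_patterns(y)))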

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-74-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual

-75-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-76-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-77-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

-78-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

-79-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-80-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a

-81-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are

-82-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-83-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829

-84-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In thecompositionof


a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto certainintervalsandmust be
followed eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What scaledegreesand
harmonies are involved?(Andtheanswers to suchquestionswill of coursedependonwhetherthe note
isin thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental ornot,andsoforth.Manycomposersandanalystshavesoughtsome
extensionorgeneralizationof tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch

-85-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized inpitch,time,and
other musicaldimensions,usingsome meansof musicalarticulation o maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrument,adynamicl
evel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted nlyto


thosewhopossessthemagic password,for-biddingtechnicalvocabulary bristlingwithexpressionslike
"6-Z44"and"intervalvector."t has thusoftenppearedto the uninitiated s
thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticaliteraturehave
come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid thistheoryomefrom nd how has
itmanagedto become sodominant?ettheorymergednresponseto the motivicand
contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinaryarietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonalmusic is
morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function sprimarystructural
eterminants.nthissituation,newmusictheorywasneeded,freeof traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

-86-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time
series ordinal patterns (MIOP) [25]. The former is a linear measure and the latter two are
non-linear ones.
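
A sketch of how those three SSMs might be computed for a pair of time series. This is my own reconstruction under stated assumptions (order-3 ordinal patterns, a 16-bin amplitude histogram for MI); the excerpt does not give the paper's exact estimator settings.

import numpy as np
from itertools import permutations
from math import factorial

def hist_mi(a, b, bins):
    """Plug-in mutual information (nats) from a 2-D histogram."""
    pab, _, _ = np.histogram2d(a, b, bins=bins)
    pab = pab / pab.sum()
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)
    nz = pab > 0
    return float(np.sum(pab[nz] * np.log(pab[nz] / np.outer(pa, pb)[nz])))

def ordinal_symbols(x, order=3):
    """Encode each length-`order` window by its permutation (ordinal) pattern."""
    index = {p: i for i, p in enumerate(permutations(range(order)))}
    windows = np.lib.stride_tricks.sliding_window_view(np.asarray(x), order)
    return np.array([index[tuple(np.argsort(w))] for w in windows])

def similarity_measures(x, y, order=3, bins=16):
    cc = abs(float(np.corrcoef(x, y)[0, 1]))        # linear: |cross-correlation|
    mi = hist_mi(x, y, bins)                        # non-linear: amplitude MI
    sx, sy = ordinal_symbols(x, order), ordinal_symbols(y, order)
    miop = hist_mi(sx, sy, bins=factorial(order))   # non-linear: MI of patterns
    return cc, mi, miop

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * rng.normal(size=t.size)
y = np.sin(t + 0.5) + 0.1 * rng.normal(size=t.size)
print(similarity_measures(x, y))   # (CC, MI, MIOP), all well above zero here

The ordinal-pattern step is what makes MIOP robust to amplitude distortions: only the rank ordering inside each short window matters, so any monotonic rescaling of a series leaves its symbol sequence unchanged.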

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance.
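
Both halves of that point fit in a few lines (a sketch; the sample size and bin counts are arbitrary assumptions of mine). The sample covariance is a single moment computation, while a histogram MI estimate for the same data shifts with the binning choice, i.e. with the implicit density model; and the example data are dependent even though their covariance is near zero.

import numpy as np

def hist_mi(x, y, bins):
    """Plug-in MI (nats); the bin count is effectively a density-model choice."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = x ** 2 + 0.1 * rng.normal(size=5000)  # stochastically dependent, covariance ~ 0

print("sample covariance:", round(float(np.cov(x, y)[0, 1]), 3))  # straight from moments
for bins in (5, 20, 80):                  # the MI estimate moves with the binning
    print(f"MI estimate, {bins:2d} bins:", round(hist_mi(x, y, bins), 3))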

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.[1] Joseph N. Straus and others have however called
such work into question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.[3] For instance, a lyne
might be associated with a register, an instrument, a dynamic level, a mode of articulation, or
any combination of these, thereby separating it out from ...
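
The leap rule described above is mechanical enough to state as code. A minimal sketch under simplifying assumptions of my own (a "leap" is anything larger than a major second, and the permissible three-note sonorities are taken to be major and minor triads; real species counterpoint is stricter than this):

# Check the rule: a leap must be followed by a step in the opposite direction,
# or by another leap such that the three notes outline a permitted triad.
LEAP = 2  # anything larger than a major second (2 semitones) counts as a leap

def outlines_triad(a, b, c):
    """Do three successive notes outline a major or minor triad (any inversion)?"""
    pcs = sorted({a % 12, b % 12, c % 12})
    shapes = {(0, 3, 7), (0, 4, 7), (0, 3, 8), (0, 4, 9), (0, 5, 8), (0, 5, 9)}
    root = pcs[0]
    return tuple((p - root) % 12 for p in pcs) in shapes

def check_leaps(melody):
    """Yield indices (into `melody`) where a leap lacks the required recovery."""
    for i in range(len(melody) - 2):
        step1 = melody[i + 1] - melody[i]
        step2 = melody[i + 2] - melody[i + 1]
        if abs(step1) > LEAP:                                  # we just leapt
            steps_back = abs(step2) <= LEAP and step1 * step2 < 0
            leaps_on = abs(step2) > LEAP and outlines_triad(*melody[i:i + 3])
            if not (steps_back or leaps_on):
                yield i + 2

# MIDI numbers: C4 D4 F4 E4 C5 B4 -- both leaps recover by a step down.
melody = [60, 62, 65, 64, 72, 71]
print(list(check_leaps(melody)))   # [] means no violations flagged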

Einojuhani Rautavaara was the leading Finnish composer of his generation * His late style
combined modernism with mystical romanticism * Series of orchestral works inspired by
metaphysical and religious subjects * Immensely popular recordings on Ondine label, including
best-selling Symphony No.7 (Angel of Light) (1995) * Operas on creative and historic themes
including Vincent (1986-87) and Rasputin (2001-03) * Widely performed choral works including
Vigilia (1971-72, rev.1996) * Works written for leading orchestras on both sides of the Atlantic

Set theory has had an air of the secret society about it, with admission granted only to those
who possess the magic password, a forbidding technical vocabulary bristling with expressions
like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the
sterile application of arcane mathematical concepts to inaudible and uninteresting musical
relationships. This situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional theoretical
literature have come to be expressed almost entirely in this unfamiliar language. Where did
this theory come from and how has it managed to become so dominant? Set theory emerged in
response to the motivic and contextual nature of post-tonal music. Tonal music uses only a
small number of referential sonorities (triads and seventh chords); post-tonal music presents
an extraordinary variety of musical configurations. Tonal music shares a common practice of
harmony and voice leading; post-tonal music is more highly self-referential: each work defines
anew its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become independent and
function as primary structural determinants. In this situation, a new music theory was needed,
free of traditional ...
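
That "forbidding" vocabulary is more concrete than it sounds. An interval vector (interval-class vector) is just a tally, over all pairs in a pitch-class set, of the six interval classes, and "6-Z44" names the hexachord whose prime form is {0,1,2,5,6,9}. The sketch below (function name invented for illustration) computes the vector and shows the Z-relation: 6-Z44 and 6-Z19 share the vector [3,1,3,4,3,1] without being transpositions or inversions of one another.

from itertools import combinations

def interval_class_vector(pcs):
    """Return the 6-entry interval-class vector of a set of pitch classes."""
    vec = [0] * 6
    for a, b in combinations(sorted({p % 12 for p in pcs}), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # interval class, 1..6
        vec[ic - 1] += 1
    return vec

print(interval_class_vector({0, 1, 2, 5, 6, 9}))   # 6-Z44 -> [3, 1, 3, 4, 3, 1]
print(interval_class_vector({0, 1, 3, 4, 7, 8}))   # 6-Z19 -> same vector (Z-related)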

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

-98-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

-99-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

Forte is well known for his book The Structure of Atonal Music (1973), which traces many of its
roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964).[6] In
these works, he "applied set-theoretic principles to the analysis of unordered collections of
pitch classes, called pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to
define the various relationships that existed among the relevant sets of a work, so that
contextual coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it (to varying
degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six publications of other types, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period in its
development, from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal,
including many biographical details, is addressed in David Carson Berry, "Journal of Music
Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale.
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as, frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
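
the "measure the oscillations to obtain a frequency" step can be sketched directly - a toy
autocorrelation pitch estimator (my own illustration, not anything from the quoted article):

import numpy as np

def estimate_f0(signal, sample_rate, fmin=50.0, fmax=2000.0):
    # pick the lag whose autocorrelation is strongest within the plausible
    # period range, and read that lag back as a frequency
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]   # lags 0..N-1
    lo = int(sample_rate / fmax)
    hi = int(sample_rate / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sample_rate / lag

sr = 44_100
t = np.arange(4410) / sr                       # 0.1 s of signal
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(estimate_f0(tone, sr))                   # ~220 Hz

note it reports the fundamental (~220 Hz) even though the waveform also carries 440 Hz energy -
roughly the frequency a listener would quantify the pitch as.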

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain
intervals and must be followed either by a step in the opposite direction or by another leap,
provided the two successive leaps outline one of a few permissible three-note sonorities. In
multi-voice contexts, the leading of a voice is determined even further. As I compose, for
instance, I ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the note is in
the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for
their own sake; they enable the listener to parse the ongoing musical fabric into meaningful
units. They help me to determine "by ear" whether the next note is in the same voice, or jumps
to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and
analysts have sought some extension or generalization of tonal voice-leading for non-tonal
music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to do
with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called
such work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are
realized in pitch, time, and other musical dimensions, using some means of musical articulation
to maintain an association between the components of a given lyne.3 For instance, a lyne might
be associated with a register, an instrument, a dynamic level, a mode of articulation, or any
combination of these, thereby separating it out from
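
the leap rule quoted above is concrete enough to encode - a toy checker (the semitone
thresholds and the omission of the three-note-sonority test are my simplifications):

def check_leaps(melody, leap=3):
    # melody: pitches in semitones (e.g. MIDI note numbers).
    # flag each leap that is followed neither by a step in the
    # opposite direction nor by another leap
    problems = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        second = melody[i + 2] - melody[i + 1]
        if abs(first) >= leap:
            contrary_step = 0 < abs(second) <= 2 and first * second < 0
            another_leap = abs(second) >= leap
            if not (contrary_step or another_leap):
                problems.append(i + 1)
    return problems

# the leap 62 -> 67 is followed by a step in the SAME direction, so it is flagged
print(check_leaps([60, 65, 64, 62, 67, 69]))   # [3]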

In its earlier days, set theory has had an air of the secret society about it, with admission
granted only to those who possess the magic password, a forbidding technical vocabulary
bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to
the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and
uninteresting musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of twentieth-century music in the
professional theoretical literature have come to be expressed almost entirely in this
unfamiliar language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and seventh
chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal
music shares a common practice of harmony and voice leading; post-tonal music is more highly
self-referential: each work defines anew its basic shapes and modes of progression. In tonal
music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music,
motives become independent and function as primary structural determinants. In this situation,
a new music theory was needed, free of traditional
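
and that "forbidding" vocabulary is mechanical underneath - an interval vector is just the
count of interval classes (1 through 6) over all pairs of a pitch-class set. a short sketch
using the standard definition:

from itertools import combinations

def interval_vector(pcs):
    # count interval classes 1..6 over all unordered pairs of pitch classes
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = min((a - b) % 12, (b - a) % 12)   # interval class
        vec[ic - 1] += 1
    return vec

# the hexachord named above, Forte 6-Z44, in one common spelling:
print(interval_vector([0, 1, 2, 5, 6, 9]))   # [3, 1, 3, 4, 3, 1]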

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.
Screen reader users, click here to turn off Google Instant...
Google

sheet Einojuhani Rautavaara - Etudes

AllVideosImagesShoppingMapsMore
SettingsTools
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - ?7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani deserves a
separate thread . . . . Logged ... His Etudes and Icons are also amazing, and his Piano Sonatas
1 and 2 are wonderful. Narcissus is also ... Anyone know where I could get some of his piano
sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016) Pianist: Laura ... Download
youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin plays Gershwin -
Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for cadenzas

-112-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
... I have most savored by pianist Mutsuko Uchida features the etudes by Claude Debussy. ...
The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ... Hamelin plays
Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet music]
ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28 Jun 1220:02
... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016)Pianist: Laura Mikkola00:03
Etude I - Thirds03:21 Etude II - Sevenths04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jos Mosalini ... Three Etudes (2000). (Olczak). Younghi
Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ... 2 The Fire
Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent to the
vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
9
10
11
Next
etween-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-113-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the
dynamics of the oscillators through the computation of a statistical similarity measure (SSM).
In this work we used three SSMs, namely the absolute value of the cross correlation (also known
as Pearsons coefficient) CC, the mutual information MI and the mutual information of the time
series ordinal patterns MIOP25. The former is a linear measure and the two latter are
non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonisticthey are complementary, describing different aspects of the

-114-
C:\Users\dan\Desktop\junk_scribd.txt Saturday, May 20, 2017 4:00 PM
association between two random variables. One could comment that Mutual Information "is not
concerned" whether the association is linear or not, while Covariance may be zero and the
variables may still be stochastically dependent. On the other hand, Covariance can be
calculated directly from a data sample without the need to actually know the probability
distributions involved (since it is an expression involving moments of the distribution), while
Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a
much more delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-a
nd-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship
(Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after
observing X. It is the KL distance between the joint density and the product of the individual
densities. So MI can measure non-monotonic relationships and other more complicated
relationships\

In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible three-note
sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I
compose, for instance, I ask: Will the next note I write down form a consonance with the
other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees
and harmonies are involved? (And the answers to such questions will of course depend on
whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules
are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical
fabric into meaningful units. They help me to determine "by ear" whether the next note is in
the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward
Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music
that appears to have little to do with tonality or even pitch concentricity.1 Joseph N.
Straus and others have however called such work into question.2 Other theorists have
obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure.
For example, in my own theory of compositional design, ensembles of (un-interpreted) pc
segments, often called lynes, are realized in pitch, time, and other musical dimensions,
using some means of musical articulation to maintain an association between the components
of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby separating it
out from ...
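
The leap rule quoted above is concrete enough to check mechanically. A toy sketch, with
pitches as integer semitones; the leap threshold and the omitted "permissible three-note
sonorities" test are my simplifications, not the author's definitions:

LEAP = 3  # semitones; treat anything larger than a whole step as a leap

def follows_leap_rule(melody):
    # after a leap: require a step back the other way, or another leap
    # (the permissible-sonority condition on double leaps is not checked)
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) >= LEAP:
            steps_back = abs(second) < LEAP and first * second < 0
            leaps_on = abs(second) >= LEAP
            if not (steps_back or leaps_on):
                return False
    return True

print(follows_leap_rule([60, 65, 64, 62, 60]))  # leap up, step back down: True
print(follows_leap_rule([60, 65, 67, 69]))      # leap up, step up: False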

Since its earliest days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often
appeared to the uninitiated as the sterile application of arcane, mathematical concepts to
inaudible and uninteresting musical relationships. This situation has created understandable
frustration among musicians, and the frustration has grown as discussions of
twentieth-century music in the professional theoretical literature have come to be expressed
almost entirely in this unfamiliar language. Where did this theory come from and how has it
managed to become so dominant? Set theory emerged in response to the motivic and contextual
nature of post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of musical
configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal
music is more highly self-referential: each work defines anew its basic shapes and modes of
progression. In tonal music, motivic relationships are constrained by the norms of tonal
syntax; in post-tonal music, motives become independent and function as primary structural
determinants. In this situation, a new music theory was needed, free of traditional ...
https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative positions
on a musical scale based primarily on their perception of the frequency of vibration.[6] Pitch
is closely related to frequency, but the two are not equivalent. Frequency is an objective,
scientific attribute that can be measured. Pitch is each person's subjective perception of a
sound wave, which cannot be directly measured. However, this does not necessarily mean that
most people won't agree on which notes are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to obtain a
frequency. It takes a sentient mind to map the internal quality of pitch. However, pitches are
usually associated with, and thus quantified as frequencies in cycles per second, or hertz, by
comparing sounds with pure tones, which have periodic, sinusoidal waveforms. Complex and
aperiodic sound waves can often be assigned a pitch by this method.
...

Theories of pitch perception try to explain how the physical sound and specific physiology of
the auditory system work together to yield the experience of pitch. In general, pitch
perception theories can be divided into place coding and temporal coding. Place theory holds
that the perception of pitch is determined by the place of maximum excitation on the basilar
membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in effect for
the perception of high frequencies, since neurons have an upper limit on how fast they can
phase-lock their action potentials.[6] However, a purely place-based theory cannot account for
the accuracy of pitch perception in the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action
potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a
stimulus.
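
The frequency half of that distinction is directly computable. A minimal sketch that measures
the strongest partial of a synthetic tone with an FFT peak; the 440 Hz test signal and the
rfft-peak method are illustrative choices, not a general pitch-detection algorithm (for this
tone the strongest partial happens to match the perceived pitch):

import numpy as np

fs = 44_100                         # sample rate in Hz
t = np.arange(fs) / fs              # one second of samples
wave = np.sin(2 * np.pi * 440 * t) + 0.2 * np.sin(2 * np.pi * 880 * t)

spec = np.abs(np.fft.rfft(wave))    # magnitude spectrum
freqs = np.fft.rfftfreq(wave.size, d=1 / fs)
print(freqs[np.argmax(spec)])       # 440.0: frequency of the strongest partial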

Forte is well known for his book The Structure of Atonal Music (1973), which traces many of its
roots to an article of a decade earlier: "A Theory of Set-Complexes for Music" (1964).[6] In
these works, he "applied set-theoretic principles to the analysis of unordered collections of
pitch classes, called pitch-class sets (pc sets). [...] The basic goal of Forte's theory was to
define the various relationships that existed among the relevant sets of a work, so that
contextual coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it (to varying
degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian analysis
and music of the Great American Songbook. A complete, annotated bibliography of his
publications appears in the previously cited article, Berry, "The Twin Legacies of a
Scholar-Teacher." Excluding items only edited by Forte, it lists ten books, sixty-three
articles, and thirty-six other types of publications, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period in its
development, from volume 4/2 (1960) through 11/1 (1967). His involvement with the journal,
including many biographical details, is addressed in David Carson Berry, "Journal of Music
Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in commemoration of his
seventieth birthday, was published in 1997 and edited by his former students James M. Baker,
David W. Beach, and Jonathan W. Bernard (FA12, FA6, and FA11, according to Berry's list). It
was titled Music Theory in Concept and Practice (a title derived from Forte's 1962
undergraduate textbook, Tonal Harmony in Concept and Practice). The second was serialized in
five installments of Gamut: The Journal of the Music Theory Society of the Mid-Atlantic,
between 2009 and 2013. It was edited by Forte's former student David Carson Berry (FA72) and
was titled A Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by Forte's
former doctoral advisees, and three special features: a previously unpublished article by
Forte, on Gershwin songs; a collection of tributes and reminiscences from forty-two of his
former advisees; and an annotated register of his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita professor of piano
at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia Univ. Teachers
College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt, Rinehart and
Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language. Cambridge, MA: MIT
Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W. W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton: Princeton Univ.
Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number

References
1. "In memoriam Allen Forte, music theorist". news.yale.edu. October 17, 2014.
2. http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
3. Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical
Quarterly 77/4 (1993), unnumbered note on 644-645.
4. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 8.
5. David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of
Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and
Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed.
David Carson Berry, Gamut 6/2 (2013), 3.
6. Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964):
136-183.
7. David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary
of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University
Press, 2013), 8:175-176.

Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered
in time.
(3) Pc space: circle of pitch-classes (no lower or higher relations) modeled by integers, mod
12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number
of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time
(and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in
music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces
a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies,
mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of
no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A, if B contains all elements of U not in A. We show the
complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
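
A minimal sketch of these definitions in Python, with pcs as integers mod 12 and pcsets as
frozensets; the names (pc, AGGREGATE) and the example sets are mine:

AGGREGATE = frozenset(range(12))            # U, the set of all twelve pcs

def pc(pitch):
    # map a pitch (integer semitones, e.g. middle C = 60) to its pitch-class
    return pitch % 12

pset = {60, 64, 67, 72}                     # a pset: C4, E4, G4, C5
pcset = frozenset(pc(p) for p in pset)      # octave duplicates collapse
print(sorted(pcset))                        # [0, 4, 7]: many psets, one pcset

A = frozenset({0, 4, 7})
B = frozenset({2, 7, 11})
print(sorted(A | B))                        # union
print(sorted(A & B))                        # intersection: [7]
print(sorted(AGGREGATE - A))                # complement A' relative to U
print((A & (AGGREGATE - A)) == frozenset()) # A and A' are disjoint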

