Supplier   : TEAM H2O
Protection : COLLECTIVE
System     : [x] Max/MSP
Date       : 04-01-05
Size       : 1 * 32 KB
"When I first encountered Max, I thought it was totally head-exploding," recalls Booth. "We came up with some pretty interesting stuff as soon as we got it. It was almost exactly what we needed. We initially got it for making MIDI applications, and it was a way for us to make sequences in which we could manipulate and generate data on the fly. We could do any combination of things. For instance, if we wanted to have a snare sound late, and the bass note as well, we could have the tracks sync'ed and variables sent across. Before then we had to do this manually, but with Max we could connect things in a very literal way. This made it a lot easier to work with drum machines. You could now jam with them during a live set, and get a pattern to slide the timing. We began using Max for live work, and then ended up using it in the studio. Most of Confield came out of experiments with Max that weren't really applicable in a club environment."
Please note: includes OS X versions of thresher by Eric Lyon and Christopher Penrose & chaos-navier-stokes by Richard Dudas. coef_bpass3~, comb1~, gain2~ & peq2~ are from Zack Settel's jimmies (IRCAM); simpleFM~ is from MSP Tutorial 11.
Contents:
Introduction
Where?
When?
Fmkik Sequence
Fmsna Sequence
fmhat/fmhat2 Sequence
fmacid/fmintermit Sequence
fmbasid Sequence
What?
The conclusion
Introduction
This Max patch by Autechre has a relatively simple structure and includes several abstractions (objects placed in the externals folder) and enclosed windows (within the main patch). We shall examine the patch by asking three questions: Where? When? What?
The answer to the question Where? will give the names of the supporting objects and the messages that pass through their connections to create a working patch. The answer to the question When? will determine the timing of events in the overall sequence that interests us. The answer to the question What? will allow us to hear the actual sounds, extracted from the units of the program at specific moments in time.
The patch responds to two keys: the Space Bar and the T key. The Space Bar carries out the same actions as the [loadbang] on loading the patch, and the T key switches the smooth frequency change of the sounds in the fmhat sequence on or off through the remote connection [r twisthat (t)]. In one mode, where there is an abrupt change in frequency, the program smooths the jump with a gradual transition from the current value where necessary; in the other, it changes the frequency immediately. In practice this does not always happen. By default, the mode governing smooth versus sharp change of intervals between sounds in fmhat is set to gradual change.
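The difference between the two modes can be sketched outside Max as a choice between an instantaneous jump and a short linear glide toward the target frequency. This is an illustrative Python sketch only; the function name, glide duration and step size are our assumptions, not values taken from the patch:

```python
def frequency_steps(current, target, smooth, glide_ms=50, step_ms=10):
    """Return the intermediate frequency values sent to the oscillator.

    smooth=True mimics the gradual-change mode (a short linear ramp,
    as a [line]-style object would produce); smooth=False jumps at once.
    """
    if not smooth:
        return [target]  # abrupt mode: one immediate change
    steps = max(1, glide_ms // step_ms)
    delta = (target - current) / steps
    return [current + delta * (i + 1) for i in range(steps)]
```

For example, `frequency_steps(440, 880, smooth=False)` yields just `[880]`, while `smooth=True` yields five evenly spaced values ending at 880.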
Where?
The destination of all the messages and signals in the patch is the digital-to-analog converter [dac~], located in the main window of the program; through it the audio signal is directed to the output of your sound card and on to your speakers. The separate audio streams are collected by [receive~] objects and undergo additional amplification by [gain2~]. The audio signal (up to the mixing stage at [fsAPa]) is the sum of seven different FM-based sound sources:
-
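The mixing stage described above, with each [receive~] bus amplified by its own [gain2~] before summation at the [dac~], can be sketched as a per-bus gain followed by a sum. A rough Python sketch; the sample blocks and gain values are illustrative, not taken from the patch:

```python
def mix(buses, gains):
    """Sum per-source sample blocks after per-bus amplification,
    mimicking the [receive~] -> [gain2~] -> [dac~] chain."""
    n = len(buses[0])
    out = [0.0] * n
    for bus, g in zip(buses, gains):
        for i, sample in enumerate(bus):
            out[i] += g * sample
    return out
```

With two buses `[[1.0, 0.0], [0.0, 1.0]]` and gains `[0.5, 2.0]`, the mixed block is `[0.5, 2.0]`.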
The synthesis of these instruments occurs in other windows, located within the main patch. Commands for sound reproduction by one or more of the synthesizers are transferred through the appropriate receiver ([r kick], [r sna], [r hiout], [r stratehat], [r aceed], [r intermit] and [r blinblin]). Presets containing different settings for volume and frequency are stored within the FM synthesizer patches themselves.
The window [aerydm] houses a number of instruments: in it are generated the commands for reproducing the bass drum sound fmkik, the snare drum fmsna and the hi-hats fmhat/fmhat2. The melodic part of the track is created in the window [aecyd], which receives commands from the synthesizers fmbacid, fmacid and fmintermit.
The first message created by the program is a <bang>, which originates from the [loadbang] object. The following things occur when this bang fires:
-
When?
The commands for the reproduction of the sounds of the FM synthesizers are
generated by the appropriate metronomes upon loading the patch (or by pressing
the Space Bar) with individually timed intervals for each section.
Step by step we will show how a sequence of commands that reproduce the sounds is created from just one tick of a metronome, expanding into several events with the aid of numerous delay lines. The resulting sequence of events can be complex, as the events emanate from several different metronomes.
For convenience, we will name the instrument sequences as follows:
-
The dependence of some sequences on others is due to the delay lines, which retain the events of a sequence for a specific time interval (more often than not, a constant) after which the given event is transmitted to the synthesizer of another sequence.
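The fan-out from a single metronome tick into several delayed events can be sketched as a tiny scheduler. This is an illustrative Python sketch; the delay values are arbitrary examples, not the patch's constants:

```python
def expand_tick(tick_time_ms, delays_ms):
    """Expand one metronome tick into a sorted list of event times:
    the undelayed event at the tick itself, plus one event per delay line."""
    events = [tick_time_ms]                       # the original tick
    events += [tick_time_ms + d for d in delays_ms]  # delayed copies
    return sorted(events)

# one tick at t=0 feeding three delay lines -> four events
print(expand_tick(0, [100, 300, 700]))  # [0, 100, 300, 700]
```

A delayed copy routed to a different receiver is exactly how one sequence's event ends up triggering another sequence's synthesizer.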
Fmkik Sequence
The primary generator for the kick drum sound [fmkik] is the metronome located in the aerydm window, which ticks once every 16*hub = 1600 milliseconds. With each new tick the command for reproducing the bass drum sound is sent to the synthesizer, and through the remote connection [r mainkik] in the window aerydm/straeki a random number determines:
a) The delay time (3*hub + 100, +200 or +300 ms with equal probability) before sending a repeated command to reproduce fmkik.
b) Whether, 10*hub = 1000 milliseconds after the first bass drum, one (33.3%) or two (66.6%) more will follow: one after 1000 ms, the other after 1000 + 4*hub = 1400 milliseconds. The same probability determines the delay time between the first bass drum and the snare drum sound fmsna (see below).
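Putting the numbers above together (hub = 100 ms, so primary kicks fall 16*hub = 1600 ms apart), the random choices for one primary kick can be sketched as follows. The function name is ours; only the delay values and probabilities come from the description above:

```python
import random

HUB = 100  # ms; the patch's timing unit, so 16*HUB = 1600 ms between kicks

def fmkik_followups(rng):
    """Return delay times (ms, relative to the primary kick) of the
    randomly generated repeat and follow-up bass-drum events."""
    events = []
    # a) repeated fmkik after 3*hub + 100/200/300 ms, equal probability
    events.append(3 * HUB + rng.choice([100, 200, 300]))
    # b) 10*hub = 1000 ms later: one more kick (1/3) or two (2/3)
    events.append(10 * HUB)                 # first extra kick, always at 1000 ms
    if rng.random() >= 1 / 3:
        events.append(10 * HUB + 4 * HUB)   # second extra kick at 1400 ms
    return sorted(events)
```

Each call thus yields either two or three follow-up events: a repeat at 400, 500 or 600 ms, a kick at 1000 ms, and possibly another at 1400 ms.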