Introduction to Computer Music: Volume One

1. How does the MIDI system work?

Interactive Performance via MIDI

One of the most innovative applications of MIDI to a real-time performance situation is MAX/MSP, a graphic programming environment originally developed by Miller Puckette at IRCAM, the center for electronic music research in Paris. MAX, named after Max Mathews, a pioneer of computer music research, was originally developed for the Macintosh, was ported to IRCAM's NeXT-based workstation, and was then re-released commercially for the Macintosh (and now for Windows PCs as well). MAX/MSP is a collection of graphic objects, written in C, that a composer can connect or "patch" together onscreen.

[Figure: screenshot of a MAX patch]

MAX patches usually take real-time MIDI input, process the information according to the composer's specifications, and output what can be a very sophisticated set of responses to MIDI instruments. MAX objects simplify the programming tasks of parsing MIDI data, timing events, and performing musical and temporal calculations. Users who wish to can also program their own objects to add to an already extensive library. Many fascinating pieces, such as Russell Pinkston's Gerrymander (2003) for clarinet and Macintosh, involve the instrumentalist playing through a pitch-tracking module interfaced to a MAX patch. The patch tests for certain pitches, phrases, densities of note-events, or even dynamic ranges, which then trigger a variety of MIDI and audio events, even processing the instrumental sound itself. An earlier work by Charles Bestor, Cycles (1992), is a sculpture, sound, and lighting installation in which MAX interactively controls both sound and lighting. One of the most significant developments has been the extension of MAX from a purely MIDI, or control, processor to an acoustic processor as well, through the addition of DSP (digital signal processing) objects (Puckette, 1991).
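MAX itself is graphical, but the kind of MIDI parsing and conditional triggering described above can be sketched in a few lines of Python. This is only an illustrative toy, not any actual patch: the function name, trigger pitches, and response rule are invented for the example.

```python
# A minimal sketch of what a simple MAX-style patch does: parse incoming
# MIDI channel messages and trigger a response when a target pitch arrives.
# (handle_message and TRIGGER_PITCHES are hypothetical names for illustration.)

NOTE_ON = 0x90            # status nibble for a MIDI note-on message
TRIGGER_PITCHES = {60, 67}  # e.g. middle C and the G above it

def handle_message(status, data1, data2):
    """Return a response event if the incoming note-on matches a trigger."""
    # A note-on with velocity 0 is treated as a note-off in MIDI practice.
    is_note_on = (status & 0xF0) == NOTE_ON and data2 > 0
    if is_note_on and data1 in TRIGGER_PITCHES:
        # Respond a perfect fifth higher at the same velocity.
        return (NOTE_ON, data1 + 7, data2)
    return None

print(handle_message(0x90, 60, 100))  # -> (144, 67, 100): trigger fires
print(handle_message(0x90, 61, 100))  # -> None: pitch not in trigger set
```

A real patch would of course route the response back out to a synthesizer rather than print it, and would typically test many conditions at once.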

Older interactive software applications, such as Intelligent Music's M (still available from Cycling '74), allow the performer to program certain boundaries or parameters of response for the computer. MIDI input then triggers an algorithmic set of MIDI-out responses. For example, from the input of a short group of notes, M might repeat the phrase several times with reordered notes, transpositions, new rhythms, or altered dynamic levels. It was envisioned that performers would develop a knowledge of the program's tendencies and a sense of control in shaping the interactions during performance.
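The kind of phrase transformation described above, reordering and transposing an input motif, can be sketched in Python. This is an illustrative toy, not M's actual algorithm, and the function name is invented:

```python
import random

def m_style_variation(phrase, transpose=0, seed=None):
    """Reorder a phrase of MIDI pitches and transpose it, roughly in the
    spirit of M's reworkings of an input phrase (the real program offers
    many more parameters, such as rhythm and dynamics)."""
    rng = random.Random(seed)   # seed allows repeatable "performances"
    varied = phrase[:]          # leave the performer's input intact
    rng.shuffle(varied)         # reorder the notes
    return [p + transpose for p in varied]

motif = [60, 62, 64, 67]        # C D E G, as MIDI note numbers
print(m_style_variation(motif, transpose=5))  # same pitches, reordered, up a fourth
```

Calling the function several times with different arguments yields the family of related variations the paragraph describes.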

Many interactive pieces have been written in which the composer chooses to control only certain parameters of the performance and creates bounds or conditions within which the computer may select certain musical actions. MIDI has been the most readily available source of output for algorithmic composers, such as Gary Nelson. One of Nelson's aims is to tailor extensive computer algorithms to make, in real time, many of the compositional decisions he would intuitively make himself, and to output them to MIDI devices in a performance situation. In compositions such as Fractal Mountains (1989), he improvises with a MIDI wind controller over an algorithmically generated background. In each performance, the computer may make an entirely different set of controlled compositional decisions.
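As a rough illustration of bounded algorithmic choice of this general kind, the Python sketch below picks pitches from a composer-supplied scale using simple weights. It is only a sketch of the idea of constrained random selection; Nelson's actual algorithms are far more elaborate and are not reproduced here.

```python
import random

def algorithmic_background(scale, length, seed=None):
    """Generate a pitch stream by weighted random choice from a scale:
    the composer sets the bounds (the scale and the weighting), and the
    computer makes the note-to-note decisions.  An illustrative sketch,
    not any composer's actual algorithm."""
    rng = random.Random(seed)
    # Favor scale degrees near the middle of the collection.
    weights = [1 + min(i, len(scale) - 1 - i) for i in range(len(scale))]
    return [rng.choices(scale, weights=weights)[0] for _ in range(length)]

c_major = [60, 62, 64, 65, 67, 69, 71]
print(algorithmic_background(c_major, 8))  # a different stream each run
```

Every pitch is guaranteed to lie in the chosen scale, but the ordering differs from performance to performance, which is the behavior the paragraph describes.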

As early as 1970, Mathews and Moore implemented the minicomputer-based GROOVE system to control real-time performance on analog synthesizers (Lincoln, 1972). The computer recorded a series of analog voltages from a small keyboard, a joystick, and various knobs. The performance playback, through 14 D/A converters, could be altered by editing the disk file. With the advent of MIDI and digital instrument technology, Mathews developed a software/hardware system that combines the resources of a sequencer and a MIDI controller to form the Mathews Radio Baton, also called the Radio Drum. In Mathews's system, a score file of MIDI note data can be read and output by his Conductor program, while expressive musical parameters such as tempo, dynamics, timbre, or any of the MIDI controller values can be controlled in real time by the Radio Drum. The controller consists of a flat surface containing a complex array of receiving antennae and small radio transmitters placed in the heads of two drumsticks. By analyzing the signal strengths, the position of each stick in relation to the drum surface can be determined along three axes. One axis is usually used to "beat" the tempo of playback, while the other five variables can be assigned to key velocity, timbre control, or any other MIDI value in the Conductor program, giving the "conductor" a wide range of expression during performance. Several composers, such as Boulanger, have composed works that involve more than one performer/conductor, each controlling a different bank of instruments.
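The final step, turning a sensed stick position into MIDI values, can be illustrated with a small Python sketch. The specific axis assignments below are hypothetical, since the Conductor program lets the performer assign axes to MIDI parameters freely:

```python
def baton_to_midi(x, y, z):
    """Map a normalized stick position (each axis in 0.0-1.0) to MIDI
    values in the 0-127 range.  The assignments here are illustrative
    only; they are not Mathews's actual mappings."""
    def clamp01(v):
        return max(0.0, min(1.0, v))

    return {
        "velocity": int(clamp01(z) * 127),  # stick height -> key velocity
        "cc1": int(clamp01(x) * 127),       # left-right -> mod wheel (CC 1)
        "cc11": int(clamp01(y) * 127),      # front-back -> expression (CC 11)
    }

print(baton_to_midi(0.5, 1.0, 0.0))  # -> {'velocity': 0, 'cc1': 63, 'cc11': 127}
```

With two sticks tracked on three axes each, six such variables are available per beat, which is why one can be reserved for tempo while five remain free for expressive control.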


Jacobs School of Music | Center for Electronic and Computer Music | ©2017-18 Prof. Jeffrey Hass