An expanding sub-field of electronic music is performance on data-driven instruments, in which a musician controls real-time sound production by performing with devices that translate human movement into data streams for use by a computer.
In Project 1 you will design a data-driven instrument, compose a short piece for it, and perform the piece. Before we are ready for that, though, we must learn some basic techniques for controlling Max from external devices.
We can use all sorts of external hardware to control sound generation in Max. There are two main aspects of this endeavor:
- the mechanics of connecting a device and accessing its data in real time, and
- the musical application of the data: how we map it to the parameters that control sound generation (a sketch of two typical mappings follows this list).
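Max is a visual patching language, so text can only stand in for a patch. Still, the core mapping idea is easy to show. The minimal Python sketch below (the function names are ours, not Max's or any library's) illustrates the two rescalings you will reach for most often: a linear rescale, analogous to Max's `scale` object, and an exponential rescale, which usually sounds more natural for frequency because pitch perception is logarithmic.

```python
def scale_linear(x, in_lo, in_hi, out_lo, out_hi):
    """Rescale x from [in_lo, in_hi] to [out_lo, out_hi], like Max's [scale] object."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)


def scale_exponential(x, in_lo, in_hi, out_lo, out_hi):
    """Equal steps in x give equal *ratios* in the output; usually a more
    musical choice for frequency, since pitch perception is logarithmic."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo * (out_hi / out_lo) ** t


# A MIDI controller sends values 0-127; map them onto four octaves (110-1760 Hz).
for cc in (0, 64, 127):
    print(cc,
          round(scale_linear(cc, 0, 127, 110.0, 1760.0), 1),
          round(scale_exponential(cc, 0, 127, 110.0, 1760.0), 1))
```

Note how the midpoint of the controller's range lands near 935 Hz under the linear mapping but near 440 Hz under the exponential one; the exponential version puts the perceptual middle of the pitch range under the physical middle of the control.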
An unusual aspect of controlling sound generation with devices that are not recognizable musical instruments is that there is no fixed association between an action taken on a device and the sound it produces. The freedom this allows can be liberating, but we always want those actions to be musically convincing.
In this exercise, we practice the mechanics of connecting devices to Max and making sound with them. As we do, we will think about which mappings (of data to parameters) make the most musical sense. We will also practice continuous control of several parameters at once, as sketched below.
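As one illustration of continuous multi-parameter control, the sketch below (again Python standing in for a patch; all names are our own invention) maps a single incoming control stream to two sound parameters at once, filter cutoff and amplitude, and smooths the raw values with a one-pole filter, much as you might use Max's `slide` or `line~` objects to avoid audible "zipper" jumps between discrete controller values.

```python
class SmoothedParam:
    """One-pole smoother: each step moves a fraction of the way toward the
    target, roughly what Max's [slide] object does."""
    def __init__(self, value=0.0, coeff=0.2):
        self.value = value
        self.coeff = coeff  # 0..1; larger means faster response

    def step(self, target):
        self.value += self.coeff * (target - self.value)
        return self.value


cutoff = SmoothedParam(value=110.0)   # filter cutoff in Hz
amp = SmoothedParam(value=0.0)        # amplitude, 0..1


def on_control(cc_value):
    """Map one 0-127 data stream onto two parameters simultaneously."""
    freq_target = 110.0 * (1760.0 / 110.0) ** (cc_value / 127.0)  # exponential
    amp_target = cc_value / 127.0                                  # linear
    return cutoff.step(freq_target), amp.step(amp_target)


for v in (0, 32, 96, 127):  # a short stream of incoming controller values
    print(v, on_control(v))
```

One data stream driving two parameters is the simplest case; the same pattern extends to any number of parameters, and deciding which streams drive which parameters, and through which curves, is exactly the mapping question this exercise asks you to think about.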