In the last exercise we explored controlling Max with external hardware. That capability makes possible the development and performance of data-driven instruments. The term describes a system composed of three parts: an external hardware device that encodes human gesture as data, software that maps those data to musical parameters, and a sound-synthesis engine that renders sound as a direct result of human movement.
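Because Max patches are graphical, a text sketch can only approximate the middle layer of this chain. As a rough illustration, here is a minimal Python sketch of the mapping stage alone, rescaling an incoming controller value to a synthesis parameter, much as Max's [scale] object does. The function name, value ranges, and simulated input are hypothetical examples, not part of the exercise.

    # Illustrative sketch (Python, not Max): the mapping layer of a
    # data-driven instrument. Names and ranges are hypothetical.

    def map_range(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi],
        analogous to Max's [scale] object."""
        normalized = (value - in_lo) / (in_hi - in_lo)
        return out_lo + normalized * (out_hi - out_lo)

    # Example: a 7-bit controller value (0-127) from the hardware layer
    # becomes an oscillator frequency (110-880 Hz) for the synthesis layer.
    for gesture in (0, 64, 127):          # simulated incoming sensor data
        freq = map_range(gesture, 0, 127, 110.0, 880.0)
        print(f"controller {gesture:3d} -> {freq:6.1f} Hz")

In a real instrument the simulated loop would be replaced by a live stream of sensor or MIDI data, and the computed value would drive an oscillator or other synthesis parameter rather than a print statement.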
In this project you will create your own data-driven instrument, develop a short composition for it, and perform it. The hardware device can be one of those discussed in the last exercise or, if none of those proves workable for you, a MIDI controller of some kind. Please don't use a MIDI keyboard as a conventional way to play notes, though; we want to be more experimental in this project.