Exercise 6: External Control of Sound

An expanding sub-field of electronic music is performance on data-driven instruments, in which a musician controls real-time sound production by performing with devices that translate human movement into data streams for use by a computer.

In Project 1 you will design a data-driven instrument, compose a short piece for it, and perform the piece. Before we're ready for that, though, we must learn some basic techniques for controlling Max from external devices.

We can use all sorts of external hardware to control sound generation in Max. There are two main aspects of this endeavor:

  1. the mechanics of connecting a device and accessing its data in real time, and
  2. the musical application of the data — how we map the data to the parameters that control sound generation.

An unusual aspect of controlling sound generation with devices that are not recognizable musical instruments is that there is no fixed association between the actions taken on a device and the sound they produce. This freedom can be liberating, but we always want the actions taken on the device to be musically convincing.

In this exercise, we practice the mechanics of connecting devices to Max and making sound using them. While we do this, we will think about which mappings (of data to parameters) make the most musical sense. We will also use continuous control of several parameters.


We’re learning how to...

  • receive data from USB game controllers, smartphones, webcams, and trackpads;
  • use this data to provide continuous parameter controls or to generate triggers;
  • handle sound durations that are known in advance, as well as durations known only after the sound has started.

How to Do This Exercise

Working on the assignment is a two-stage process.

  1. Download Exercise 6 Max Tips. This folder of Max patches shows you how to use external hardware to trigger sound files. Open them in Max in order (part 1, then part 2, etc.), reading the comments and operating the patches.

    The first four parts show how to send data to Max from the computer trackpad (or mouse), USB gamepads, phone apps, and webcams. Not all of these methods will work for every person, but hopefully you will find one that does. We prefer not to use standard keyboard-oriented MIDI controllers for this assignment.

    Even if a particular method won’t work for you, please read the patch about it, so that you understand the basic ideas and any Max objects introduced.

  2. Make a patch that plays sounds (of any kind — sound files or synthetic notes) in response to actions on some kind of controller (such as buttons on a gamepad or phone screen). Also implement some kind of continuous control. It’s possible to generate triggers from continuous data, in case your controller has no buttons. Please consult the Tips files for ideas. Be ready to demonstrate your patch in tutorial by doing something musical with it.

See the Suggestions section below.

Be sure you understand what each of these Max objects does:

  • hi, umenu, route (part 2)
  • udpreceive, OSC-route, unpack (part 3)
  • Vizzie modules — camera, RGB analysis (part 4)
  • delay (del), keyup (part 5)
  • past, int, change, speedlim (part 6)


Suggestions

  • The first step in using external hardware is to connect to it and make sure Max is receiving data. This is fairly straightforward with MIDI but can be challenging with other devices.

    If you have trouble getting a device to work, please let your instructor know right away.

  • Study the Max Tips patches for this week, and set up your Max code to recognize whatever data your hardware device is emitting.
  • Start with something simple, such as playing a sound file in response to a button press or (less simple) a trigger derived from continuous data.

    If you are using the trackpad for continuous x/y coordinates, it will be okay to use computer keys as trigger buttons. Be sure to absorb the technique, described in the part 5 patch, for stopping a sound before it has finished playing.
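    Deriving a trigger from a continuous stream amounts to firing once when the value crosses a threshold, then re-arming once it falls back below — roughly what Max's past object does. Here is a minimal Python sketch of that logic; the class name, threshold, and sample values are illustrative, not part of any Max patch:

    ```python
    # Hypothetical sketch: fire a single trigger ("bang") on an upward
    # threshold crossing, and re-arm only after the value drops back
    # below the threshold, so a sustained high value fires just once.
    class ThresholdTrigger:
        def __init__(self, threshold):
            self.threshold = threshold
            self.armed = True

        def update(self, value):
            """Return True (a 'bang') on an upward threshold crossing."""
            if self.armed and value >= self.threshold:
                self.armed = False
                return True
            if value < self.threshold:
                self.armed = True
            return False

    trig = ThresholdTrigger(0.8)
    stream = [0.1, 0.5, 0.85, 0.9, 0.4, 0.95]
    bangs = [trig.update(v) for v in stream]
    # fires at 0.85 and 0.95, but not at 0.9 (value never re-armed)
    ```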

  • Playing a sound file in response to a button press on a controller requires that you convert an incoming button value into a cue number for playlist~.

    In the case of most HID buttons, no conversion is necessary: the values are typically 1 and 0, which will start and stop playback of the first (and perhaps only) cue in a playlist~.

    When it is necessary to convert the data, you can do this by matching incoming data values using the select object and then sending a bang from the appropriate outlet of select to a message box containing the number 1, which can play a cue in playlist~. This is no different from what we did in Exercise 2.
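    In other words, the select-and-bang pattern matches specific incoming values and translates each match into a cue number. Here is a hypothetical Python sketch of that mapping; the button codes and cue numbers are made up for illustration:

    ```python
    # Hypothetical sketch of the `select` pattern: match specific
    # incoming button values and translate each match into a
    # playlist~ cue number. These codes are made up for illustration.
    BUTTON_TO_CUE = {4: 1, 5: 2, 6: 3}

    def button_to_cue(value):
        """Return the cue number for a recognized button value, else
        None (like data passing out select's rightmost outlet)."""
        return BUTTON_TO_CUE.get(value)

    button_to_cue(5)   # recognized button -> cue 2
    button_to_cue(99)  # unrecognized value -> None
    ```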

  • Decide whether you want to control the duration of a sound in real time, so that your action stops playback as well as starts it. (This is easy with HID buttons that send 1 when pressed and 0 when released.) Alternatively, you can treat the button press as a trigger for a fixed duration and ignore the button release value. A fixed duration can be the full length of the sound file, a selection made in the playlist~ waveform display, or an interval given as an argument to a delay object, which schedules the stop that long after playback starts.
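  • The two duration strategies can be contrasted in a short Python sketch. Everything here is hypothetical scaffolding: Player stands in for playlist~, and schedule stands in for a delay object; only the press/release logic is the point.

    ```python
    # Sketch of the two duration strategies, assuming button events
    # arrive as integers: 1 = press, 0 = release.
    class Player:
        """Hypothetical stand-in for playlist~; records start/stop."""
        def __init__(self):
            self.events = []
        def start(self):
            self.events.append("start")
        def stop(self):
            self.events.append("stop")

    def realtime_control(value, player):
        """Press starts playback; release stops it."""
        if value == 1:
            player.start()
        elif value == 0:
            player.stop()

    def fixed_duration(value, player, duration_ms, schedule):
        """Press starts playback and schedules a stop duration_ms
        later (like a delay object); the release value is ignored."""
        if value == 1:
            player.start()
            schedule(duration_ms, player.stop)

    # real-time control: press, then release
    p = Player()
    realtime_control(1, p)
    realtime_control(0, p)

    # fixed duration: schedule() would normally be a timer; here the
    # callback runs immediately just to show the call pattern
    q = Player()
    fixed_duration(1, q, 500, lambda ms, fn: fn())
    fixed_duration(0, q, 500, lambda ms, fn: fn())  # release ignored
    ```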
  • When using continuous controls available on your chosen device, map the input range to your desired output range using scale.
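  • Under the hood, scale performs a linear range mapping. A Python sketch of that mapping follows; the function mirrors scale's basic behavior (the Max object also offers an optional exponential curve, not shown), and the example ranges are assumptions:

    ```python
    def scale(x, in_lo, in_hi, out_lo, out_hi):
        """Linear range mapping, like the default behavior of Max's
        `scale` object (its optional exponential mode is omitted)."""
        return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

    # e.g. map a 0-127 HID axis value to a 200-800 Hz frequency range
    freq = scale(63.5, 0, 127, 200.0, 800.0)
    # the midpoint of the input range lands at the output midpoint, 500.0
    ```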


Submitting the Exercise

  • Be sure you have satisfied the criteria listed above.
  • Submit your Max patch in Canvas. If your patch requires sound files or other patches, put all files in a folder, and zip the folder before submitting.

    CAUTION: Be sure to test your patch after moving it into a new folder with other files.

Grading Criteria

This exercise is graded pass/fail. You must submit the exercise by Thursday midnight to be eligible for a pass.

Your patch must

  • operate correctly and
  • implement the functionality described in the “How To Do This Exercise” section above.