A Music Performance Environment for the NeXT Computer

Michael McNabb
Manager, Sound & Music Software
NeXT, Inc., Redwood City, CA, USA


Ensemble, a real-time computer music performance application, is described. Ensemble demonstrates the potential for tightly integrated synthesis, performance, and algorithmic composition software running on a platform with highly-integrated digital audio, signal processing, and MIDI capabilities. Because it is document-oriented, Ensemble provides nearly unlimited flexibility in the structuring and allocation of system resources.


Ensemble is a computer music performance application which operates in real time, provides several levels of functionality, and is software extensible. It demonstrates the possibilities for synthesis, performance, and algorithmic composition in music software running on a platform with highly-integrated digital audio, signal processing, and MIDI capabilities.

In its simplest mode, Ensemble provides a way to control sound synthesis on a DSP56001 directly from MIDI input and the user interface. Alternatively, or together with DSP synthesis, the same MIDI data may be sent back out a serial port to another MIDI device. Sound files may be played back synchronized to MIDI input, or mixed with the DSP output. Other modes of operation include the orchestration of notes from MIDI files or Music Kit score files, interactive note processing, real-time algorithmic composition, and computer-assisted improvisation.

A version of Ensemble was used extensively by the author to realize his 1989 composition The Lark Full Cloud.


Ensemble is implemented on the NeXT Computer and is written in Objective-C. It makes use of object classes and functions from the Application Kit, Music Kit, Sound, and MIDI software libraries, and utilizes the resident DSP56001 signal processor and D/A converters for synthesis and sound playback [1, 2].

The Performance Window

The user interface to Ensemble consists primarily of a top-level panel which controls the performance of scores and algorithms, a hierarchical menu, and one or more document windows. The performance panel controls the playback, stopping, pausing, and recording of scores and algorithms using a set of tape-transport-like buttons (figure 1). The record and playback modes of each part of the score are also controlled by a matrix of buttons. Only one score may be active at a time. The score may be created by reading in a standard MIDI file or Music Kit Score file, or by recording MIDI input or the results of an algorithmic performance.

Ensemble also includes a virtual on-screen clavier which allows entry of notes with the mouse or computer keyboard when a MIDI controller is not available or needed (figure 2).


Ensemble is document-oriented. Each document consists of a window with room for the display of interfaces to four "input" stages and four "instruments" (figure 3). An input stage consists of a chain of one or more instances of a subclass of the Music Kit NoteFilter object. Each chain of note filters may receive Music Kit Note objects from one MIDI channel and one part of a score simultaneously. Notes are passed sequentially down the chain, with each note filter possibly performing some modification to the note or altering some process according to its parameters. Finally, the notes are sent to one or more Music Kit Instrument objects for realization (figure 4). Documents may be saved and loaded via the main menu. Any number of documents may be created. Only the active document is connected to MIDI input, the score, and the DSP or sound resources. However, each document may be assigned a MIDI program change number and channel, so that the active document may be changed during a performance. This allows nearly unlimited flexibility in the use of system resources. If Ensemble ceases to be the active application, it relinquishes control of the DSP unless a performance is in progress, in which case it continues to play in the background.
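The flow of notes described above, from an input stage down a chain of filters to one or more instruments, can be sketched as follows. This is an illustrative Python sketch of the chain-of-responsibility pattern, not the actual Objective-C Music Kit code; the class names `NoteFilter`, `Transposer`, and `Instrument` here are stand-ins for their Music Kit counterparts.

```python
# Illustrative sketch (not Music Kit code): notes travel sequentially
# down a chain of filters, each of which may modify, generate, or drop
# notes, before reaching an instrument for realization.

class Note:
    def __init__(self, key, velocity):
        self.key = key            # MIDI key number (0-127)
        self.velocity = velocity  # MIDI velocity (0-127)

class NoteFilter:
    """Base class: passes notes to its successor unchanged."""
    def __init__(self):
        self.successor = None     # next filter or instrument in the chain
    def connect(self, successor):
        self.successor = successor
        return successor
    def send(self, note):
        if self.successor is not None:
            self.successor.receive(note)
    def receive(self, note):
        self.send(note)

class Transposer(NoteFilter):
    """Analogue of the always-present range/transposition filter."""
    def __init__(self, semitones):
        super().__init__()
        self.semitones = semitones
    def receive(self, note):
        note.key += self.semitones
        self.send(note)

class Instrument:
    """End of the chain: records the notes it would realize."""
    def __init__(self):
        self.played = []
    def receive(self, note):
        self.played.append((note.key, note.velocity))

# Wire one input stage: transposition filter -> instrument.
inst = Instrument()
chain = Transposer(12)      # transpose up one octave
chain.connect(inst)
chain.receive(Note(60, 100))
```

A real chain would interpose further filters between `chain` and `inst` by repeated calls to `connect`, mirroring the four input stages of an Ensemble document.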

Note Filters

Note filters are instances of subclasses of the Music Kit NoteFilter class, and are objects which act on each note they receive. A note filter may, for example, modify a note, analyze it, generate new notes based on it, or discard it. It can then pass that note and/or its own new notes to an instrument or to another note filter. Any number of note filters with arbitrary functionality may be chained together between the MIDI and score inputs and the instruments. Each note filter chain may contain the same set of note filters as another, or completely different ones. Note filter functionality ranges from simple thinning of notes, to complex MIDI processing during a performance, to algorithmic composition with real-time interactive modification. One note filter, which determines key number range and transposition, is always included for each input, and its interface appears directly in the document window. Other note filters are instantiated by selecting from a pop-up menu, and their interfaces are displayed in separate panels as required.
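As a concrete case, the "simple thinning of notes" mentioned above can be sketched as a filter that drops notes arriving closer together than some minimum time interval. This is an assumed, self-contained Python illustration; Ensemble's actual thinning filter may use different criteria.

```python
# Illustrative sketch: a note-thinning filter that keeps a note only if
# enough time has elapsed since the last note it let through.

class ThinningFilter:
    def __init__(self, min_gap):
        self.min_gap = min_gap    # minimum allowed gap in seconds
        self.last_time = None     # onset time of the last kept note
    def filter(self, notes):
        """notes: list of (onset_time, key) pairs, in time order."""
        kept = []
        for t, key in notes:
            if self.last_time is None or t - self.last_time >= self.min_gap:
                kept.append((t, key))
                self.last_time = t
        return kept

# A burst of three notes: the middle one arrives too soon and is dropped.
thin = ThinningFilter(min_gap=0.1)
result = thin.filter([(0.0, 60), (0.05, 62), (0.2, 64)])
```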

Algorithmic generation of notes is accomplished by associating an instance of a subclass of the Music Kit Performer class with a note filter. The note filter observes the note stream for parameters of interest while the performer runs the algorithm, controlled by the same Conductor object which controls the timing of notes for performing Score files. Thus, all compositional algorithms are synchronized to any recorded parts being played back, with tempo changes affecting the entire performance.
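The synchronization scheme above can be sketched as a single conductor that maps beats to seconds for every scheduled event, so that one tempo value governs both score playback and algorithmic performers. This Python sketch is illustrative only; the Music Kit Conductor supports richer behavior (pausing, tempo curves) than shown here.

```python
# Illustrative sketch: one conductor schedules all events in beat time
# and converts beats to seconds with a shared tempo, so a tempo change
# affects every performer and every recorded part at once.
import heapq

class Conductor:
    def __init__(self, bpm=60.0):
        self.bpm = bpm            # shared tempo, beats per minute
        self.queue = []           # min-heap of (beat, seq, callback)
        self._seq = 0             # tie-breaker so callbacks never compare
    def schedule(self, beat, callback):
        heapq.heappush(self.queue, (beat, self._seq, callback))
        self._seq += 1
    def run(self):
        """Fire events in beat order, returning (seconds, result) pairs."""
        events = []
        while self.queue:
            beat, _, cb = heapq.heappop(self.queue)
            seconds = beat * 60.0 / self.bpm   # beat -> wall-clock time
            events.append((seconds, cb()))
        return events

# Two "performers" share the conductor: at 120 bpm, beat n falls at n/2 s.
cond = Conductor(bpm=120.0)
cond.schedule(0, lambda: "score-note")
cond.schedule(1, lambda: "algorithm-note")
cond.schedule(2, lambda: "score-note")
timeline = cond.run()
```

Doubling `bpm` before `run` would halve every event time uniformly, which is the point of routing all timing through one conductor.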

Current Ensemble note filters include:


Instruments are instances of a subclass of the Music Kit Instrument class, and are the controlling objects for a Music Kit SynthPatch, a MIDI output channel, or sound file playback. The main interface for an instrument is displayed in the lower portion of the document window. This usually includes sliders for panning, level adjustment, and two other parameters. Additional parameters, if any, are accessed via inspector sub-panels (figure 10). The parameters of the instruments of the active document are included when a score is saved to a file, so that voicings of DSP instruments created in Ensemble may be readily incorporated into other Music Kit applications. The total number of DSP voices possible varies with the algorithms selected and the choice of sampling rate.

Current instrument types include:

Example Application

One document configuration used by the author serves as a rich example. Two input stages are used, each with a fractal melody note filter, a harmonics generator note filter, and a data mapper note filter. Each note filter chain is connected both to a MIDI output instrument and a DSP synthesis instrument. The harmonics generator produces a fast-moving series of notes based on the overtones of the slower fractal melodies. The key numbers and velocities generated are then also mapped to MIDI controllers. The controller data is sent out to an external signal processor and controls the resonant frequencies and level scaling of two resonant filter banks. The notes are also sent to the two DSP instruments, which play along in parallel. Cross-fading can then be performed between the externally-processed live sound and the synthesized sound. Parameters of the algorithmic processes may also be adjusted via the mouse or external MIDI controllers in real time.
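The core arithmetic of a harmonics generator, deriving notes from the overtone series of a fundamental, can be sketched as follows. This is my own illustration of the standard overtone-to-key mapping (the nth harmonic lies 12·log2(n) semitones above the fundamental), not Ensemble's actual filter code.

```python
# Illustrative sketch: map the overtone series of a fundamental MIDI key
# to the nearest equal-tempered MIDI key numbers, as a harmonics
# generator note filter might when elaborating a slow fractal melody.
import math

def harmonic_keys(base_key, n_harmonics):
    """Return MIDI keys nearest to harmonics 1..n of base_key.

    The nth harmonic is n times the fundamental frequency, i.e.
    12 * log2(n) semitones above the fundamental.
    """
    return [base_key + round(12 * math.log2(n))
            for n in range(1, n_harmonics + 1)]

# First six harmonics of C3 (MIDI key 48): C3, C4, G4, C5, E5, G5.
keys = harmonic_keys(48, 6)
```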


The source code for Ensemble is intended to be included as a programming example in a future release of the NeXT system. Documentation will be included which describes how to create additional instruments and note filters, and link them into a customized version of the program.

About the Author

Composer Michael McNabb holds a Doctor of Musical Arts in Music Composition from Stanford University, where he studied with Leland Smith and John Chowning. He realized his compositions at the Center for Computer Research in Music and Acoustics for over 10 years, and now works at his private digital music studio in San Francisco. He has received awards from the Prix Ars Electronica, the National Endowment for the Arts, the Bourges Festival, and the League of Composers / ISCM. Two CD releases are available, Computer Music (Mobile Fidelity Sound Labs MFCD-818), and Invisible Cities (Wergo 2015-50). Michael McNabb is currently manager of the Sound and Music Group at NeXT Computer. He is an Associate AES member, and performed at the 1987 AES Music and Digital Audio Conference.

1. NeXT Preliminary 1.0 System Reference Manual.
2. Jaffe, D., and L. Boynton. 1989. "An Overview of the Sound and Music Kits for the NeXT Computer." Computer Music Journal 13(2): 48-55.

Figure 1. Ensemble performance control panel

Figure 2. Ensemble Virtual MIDI Clavier

Figure 3. An Ensemble document showing four types of instruments selected.

Figure 4. The class structure and flow of notes in Ensemble (NoteSenders and NoteReceivers not represented). Smallest boxes are object instances, labels indicate the classes and superclasses.

Figure 5. MIDI mapper note filter.

Figure 6. Harmonics generator note filter/performer.

Figure 7. Spatial location note filter.

Figure 8. Fractal melody generator note filter/performer.

Figure 9. Fractal function parameter inspector.

Figure 10. Music Kit parameter inspector for simple FM instrument.