w at williambrent dot com

William Brent is a computer musician and Assistant Professor of Audio Technology at American University in Washington, DC. His creative work spans experimental music performance, sound art, and instrument design, and involves various combinations of human-, robotic-, and computer-realized sound. In collaboration with internationally recognized composers and performers, he develops and operates real-time audiovisual manipulation software for intermedia performance works, such as James Dillon’s Nine Rivers and Philippe Manoury’s Jupiter, Pluton, and Neptune. In this capacity, he has presented work at venues such as SESC (São Paulo), Glasgow Concert Halls (Scotland), Miller Theatre (New York), and the National Gallery of Art (Washington, DC). As a programmer, Brent has developed open source software libraries for the Pure Data (Pd) programming environment that are used by an international community of artists and researchers. His current lines of research include new methods for physical control of synthesized audio, signal analysis techniques for quantifying timbre, and various aspects of human timbre perception.

Brent studied piano performance and composition at Wilfrid Laurier University and Mills College, earning Bachelor and Master of Arts degrees in Music. He holds a Ph.D. in Music from the University of California, San Diego, where he studied in the computer music area with Miller Puckette, F. R. Moore, and Shlomo Dubnov. Centered on various understandings of timbre, his dissertation research examined signal processing techniques for automatic classification of percussion instruments, as well as the relationship between objective measurements and human judgments of percussive sounds.

In parallel with his dissertation work on timbre, Brent developed the timbreID software library, an open source suite of Pure Data objects for real-time and offline audio analysis and identification. He has authored various other tools for use in Pd, including DILib, which is geared toward facilitating the creation of novel digital musical instruments. Using these libraries, he recently developed the Open Shaper (a digital musical instrument controlled by shaping a virtual polygon with open-air fingertip movements) and the Gesturally Extended Piano, an extended instrument that exploits the large-scale arm motions associated with standard piano technique to control real-time sound processing, spatialization, and synthesis. His work has been presented at major venues in the field of computer music, including the International Computer Music Conference, the International Conference on New Interfaces for Musical Expression, and the International Conference on Auditory Display.