Real-time musical applications using FFT-based resynthesis
Abstract
The Fast Fourier Transform (FFT) is a powerful general-purpose algorithm widely used in signal analysis. FFTs are useful whenever the spectral information of a signal is needed, as in pitch tracking or vocoding algorithms. The FFT can be combined with the Inverse Fast Fourier Transform (IFFT) to resynthesize signals based on these analyses. This application of the FFT/IFFT is of great interest in electro-acoustic music because it allows a high degree of control over a given signal's spectral information (an important aspect of timbre), permitting flexible and efficient implementation of signal-processing algorithms. Real-time implementations of the FFT and IFFT are of particular interest, since they can provide musicians with highly responsive and straightforward means of generating and controlling sound in live performance. This paper presents musical applications using the IRCAM Signal Processing Workstation (ISPW) that make use of FFT/IFFT-based resynthesis for timbral transformation in real time. An intuitive and straightforward user interface, intended for use by musicians, has been developed by the authors in the Max programming environment. Techniques for filtering, cross-synthesis, noise reduction, dynamic spectral shaping, and resynthesis are presented, along with control structures that allow fine timbral modification and the control of complex sound transformations using few parameters. Emphasis is also placed on developing control structures derived from real-time analyses (in both the time and frequency domains) of a musician's input. The ideas and musical applications discussed in this paper offer composers an intuitive approach to timbral transformation in electro-acoustic music, and new possibilities in live signal processing that promise to be of general interest to musicians.
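The general FFT/IFFT resynthesis pipeline mentioned above (windowed analysis, per-bin spectral modification, inverse transform with overlap-add) can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' ISPW/Max implementation; the function name, frame size, and hop size are assumptions chosen for demonstration:

```python
import numpy as np

def fft_resynthesize(x, frame_size=1024, hop=256, spectral_gain=None):
    """Analyze x with a windowed FFT, optionally scale each frequency bin
    (a crude form of filtering / spectral shaping), and resynthesize by
    inverse FFT with overlap-add. Hypothetical sketch, not the paper's code."""
    window = np.hanning(frame_size)
    n_frames = 1 + (len(x) - frame_size) // hop
    y = np.zeros(len(x))
    norm = np.zeros(len(x))          # accumulated window energy for normalization
    for i in range(n_frames):
        start = i * hop
        frame = x[start:start + frame_size] * window
        spectrum = np.fft.rfft(frame)
        if spectral_gain is not None:
            spectrum *= spectral_gain  # per-bin gains: filtering, noise gating, etc.
        out = np.fft.irfft(spectrum) * window  # synthesis window
        y[start:start + frame_size] += out
        norm[start:start + frame_size] += window ** 2
    norm[norm < 1e-12] = 1.0         # avoid division by zero at the edges
    return y / norm
```

With `spectral_gain=None` the chain is an identity (analysis followed immediately by resynthesis); supplying a gain vector of length `frame_size // 2 + 1` shapes the spectrum before the inverse transform, which is the basic mechanism behind the filtering and cross-synthesis techniques the paper describes.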