Papers by Richard Polfreman

Hand posture recognition: IR, sEMG and IMU
Hands are important anatomical structures for musical performance, and recent developments in input device technology have allowed rather detailed capture of hand gestures using consumer-level products. While in some musical contexts detailed hand and finger movements are required, in others it is sufficient to communicate discrete hand postures to indicate selection or other state changes. This research compared three approaches to capturing hand gestures where the shape of the hand, i.e. the relative positions and angles of finger joints, is an important part of the gesture. A number of sensor types can be used to capture information about hand posture, each with practical advantages and disadvantages for music applications. The study compared optical, inertial and muscular sensing, using three sets of five hand postures (i.e. static gestures) and gesture recognition algorithms applied to the device data, aiming to determine which methods are most effective.
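The abstract does not specify the recognition algorithms compared; as an illustration of the general approach only, a minimal nearest-centroid classifier over per-device feature vectors (a hypothetical simplification, not the paper's method) might look like:

```python
import numpy as np

def train_centroids(features, labels):
    """Compute one mean feature vector (centroid) per posture label.

    features : (n_samples, n_features) sensor-derived feature vectors
    labels   : (n_samples,) posture names for each sample
    """
    return {lab: features[labels == lab].mean(axis=0) for lab in np.unique(labels)}

def classify(sample, centroids):
    """Assign a sample to the posture whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda lab: np.linalg.norm(sample - centroids[lab]))
```

The same sketch applies whether the feature vector comes from optical joint angles, IMU orientations, or sEMG channel energies; only the feature extraction differs per sensor type.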

Zenodo (CERN European Organization for Nuclear Research), Jun 7, 2022
Graphical interpolators provide a simple mechanism for synthesis-based sound design by offering a level of abstraction above the synthesis parameters. These systems supply users with two sensory modalities in the form of sonic output from the synthesis engine and visual feedback from the interface. A number of graphical interpolator systems have been developed over the years that provide users with different visual cues, via the graphical display. This study compares user interactions with six interpolation systems that have alternative visualizations, in order to investigate the impact that the interface's different visual cues have on the process of locating sounds within the space. We also present a dimension space analysis of the interpolators and compare this with the user studies to explore its predictive potential in evaluating designs. The outcomes from our study help to better understand design considerations for graphical interpolators and will inform future designs.
The Interactive Music Awareness Programme (IMAP) for cochlear implant users - Online Web Resource
The aim of the IMAP is to aid music appreciation. It was developed using a participatory design approach with adult cochlear implant (CI) users from the University of Southampton Auditory Implant Service and members of the UK National Cochlear Implant Users Association.
An introduction to musicSpace

Proceedings of the SMC Conferences, May 28, 2019
This paper presents a framework that supports the development and evaluation of graphical interpolated parameter mapping for the purpose of sound design. These systems present the user with a graphical pane, usually two-dimensional, where synthesizer presets can be located. Moving an interpolation point cursor within the pane will then create new sounds by calculating new parameter values, based on the cursor position and the interpolation model used. The exploratory nature of these systems lends itself to sound design applications, which also have a highly exploratory character. However, populating the interpolation space with "known" preset sounds allows the parameter space to be constrained, reducing the design complexity otherwise associated with synthesizer-based sound design. An analysis of previous graphical interpolators is presented and from this a framework is formalized and tested to show its suitability for the evaluation of such systems. The framework has then been used to compare the functionality of a number of systems that have been previously implemented. This has led to a better understanding of the different sonic outputs that each can produce and highlighted areas for further investigation.

Personal and Ubiquitous Computing, 2020
Graphical interpolation systems provide a simple mechanism for the control of sound synthesis by providing a level of abstraction above the engine parameters, allowing users to explore different sounds without awareness of the underlying details. Typically, a graphical interpolator presents the user with a two-dimensional pane where a number of synthesizer presets, each representing a collection of synthesis parameter values, can be located. Moving an interpolation cursor within the pane results in the calculation of new parameter values, based on its position, the relative locations of the presets, and the mathematical interpolation function, thus generating new sounds. These systems supply users with two sensory modalities in the form of sonic output and visual feedback from the interface. A number of graphical interpolator systems have been developed over the years, with a variety of user-interface designs, but few have been subject to formal user evaluation. Our testing studied ...
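The core mechanism described above (cursor position plus preset locations plus an interpolation function yielding new parameter values) can be sketched minimally. This is a generic inverse-distance-weighted example, not the specific model used by any of the systems evaluated; the function name and the `power` smoothing parameter are illustrative assumptions:

```python
import numpy as np

def interpolate(cursor, preset_points, preset_params, power=2.0, eps=1e-9):
    """Inverse-distance-weighted interpolation of synthesis parameters.

    cursor        : (2,) cursor position within the pane
    preset_points : (n, 2) preset locations within the pane
    preset_params : (n, p) synthesis parameter values for each preset
    Returns a (p,) vector of interpolated parameter values.
    """
    d = np.linalg.norm(preset_points - cursor, axis=1)
    w = 1.0 / (d ** power + eps)   # nearer presets contribute more
    w /= w.sum()                   # normalise weights to sum to 1
    return w @ preset_params       # weighted mix of the preset parameter sets
```

Placing the cursor on a preset reproduces (approximately) that preset's sound; positions between presets yield new parameter blends, which is the exploratory behaviour these interfaces rely on.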

Organised Sound, 2002
Modalys-ER is a graphical environment for creating physical model instruments and generating musical sounds with them. While Modalys-ER provides users with a relatively simple-to-use interface, it has only limited methods for mapping control data onto model parameters for performance. While these are sufficient for many interesting applications, they do not bridge the gap from high-level specifications such as MIDI files or Standard Western Notation (SWN) down to low-level parameters within the physical model. With this issue in mind, a part of Modalys-ER has now been ported to OpenMusic, providing a platform for developing more sophisticated automation and control systems that can be specified through OpenMusic's visual programming interface. An overview of the MfOM library is presented and illustrated with several musical examples using some early mapping designs. Also, some of the issues relating to building and controlling virtual instruments are discussed and future directions for research in this area are suggested. The first release is now available via the IRCAM Software Forum.
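The gap described here is between high-level note data and low-level physical-model parameters. Purely as an illustration of that bridging step (the actual MfOM mapping designs are not reproduced here, and both the function name and the returned parameter names are hypothetical), a trivial mapping from a MIDI note event might look like:

```python
def midi_to_model_params(note, velocity):
    """Map a MIDI note event to illustrative physical-model parameters.

    note     : MIDI note number (69 = A4)
    velocity : MIDI velocity, 0-127
    The parameter names below are invented for this sketch, not MfOM's.
    """
    freq = 440.0 * 2 ** ((note - 69) / 12)  # equal-tempered fundamental
    strike_force = velocity / 127.0         # normalised excitation strength
    return {"fundamental_hz": freq, "excitation_force": strike_force}
```

Real mappings must also handle time-varying control and the many coupled parameters of a physical model, which is what motivates moving this layer into OpenMusic's visual programming environment.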

Although modern software-based DAWs (Digital Audio Workstations) offer the ability to interconnect with plug-in effects, they can be restrictive due to their architecture being largely based on hardware mixing desks. This is especially true when complex multi-effect sound design is required. This paper aims to demonstrate how a plug-in that can host other effects plug-ins can help improve the sound design possibilities in a DAW. This hosting plug-in allows other effects to be "inserted" at specific points in its internal signal flow. Details are given of a "proof of concept" plug-in that was created to demonstrate that it was possible to create plug-ins that can host other plug-ins, using Apple's AU (Audio Unit) format. The proof of concept is a delay effect that allows other effects plug-ins to be inserted in either the "delay path", "feedback path" or both. This Audio Unit has been extensively tested using different DAWs and has been found to work successfully in a variety of situations. Finally, details are given of how improvements can be made to the plug-in hosting delay.
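The signal flow described, a delay with insert points in the delay and feedback paths, can be sketched in a few lines. This is a per-sample reference model with hosted effects stood in by plain callables, not the Audio Unit implementation; the function name and parameters are illustrative:

```python
import numpy as np

def hosted_delay(x, delay_samples, feedback=0.5, mix=0.5,
                 delay_fx=None, feedback_fx=None):
    """Delay effect with optional hosted effects in its internal signal flow.

    delay_fx / feedback_fx are callables (sample -> sample) standing in for
    hosted plug-ins; None leaves that insert slot empty (pass-through).
    """
    delay_fx = delay_fx or (lambda s: s)
    feedback_fx = feedback_fx or (lambda s: s)
    buf = np.zeros(delay_samples)  # circular delay line
    out = np.zeros_like(x)
    idx = 0
    for n, s in enumerate(x):
        delayed = delay_fx(buf[idx])                     # insert in the delay path
        out[n] = (1 - mix) * s + mix * delayed
        buf[idx] = s + feedback * feedback_fx(delayed)   # insert in the feedback path
        idx = (idx + 1) % delay_samples
    return out
```

Because each echo re-enters through `feedback_fx`, an effect placed there is applied cumulatively on every repeat, which is exactly the kind of routing a desk-style DAW insert chain cannot express directly.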

Comparing onset detection and perceptual attack time
Accurate performance timing is associated with the perceptual attack time (PAT) of notes, rather than their physical or perceptual onsets (PhOT, POT). Since manual annotation of PAT for analysis is both time-consuming and impractical for real-time applications, automatic transcription is desirable. However, computational methods for onset detection in audio signals are conventionally measured against PhOT or POT data. This paper describes a comparison between PAT and onset detection data to assess whether in some circumstances they are similar enough to be equivalent, or whether additional models for PAT-PhOT difference are always necessary. Eight published onset algorithms, and one commercial system, were tested with five onset types in short monophonic sequences. Ground truth was established by multiple human transcription of the audio for PATs using rhythm adjustment with synchronous presentation, and parameters for each detection algorithm manually adjusted to produce the maximum agreement with the ground truth. Results indicate that for percussive attacks, a number of algorithms produce data close to or within the limits of human agreement and therefore may be substituted for PATs, while for non-percussive sounds corrective measures are necessary to match detector outputs to human estimates.
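None of the eight published algorithms are reproduced in this listing; to illustrate the class of detectors being compared against PAT ground truth, a minimal short-time-energy onset detector (an assumption-laden toy, with invented `ratio` and `floor` parameters) could be:

```python
import numpy as np

def energy_onsets(x, frame=512, hop=256, ratio=2.0, floor=1e-6):
    """Flag an onset wherever short-time energy jumps by more than `ratio`
    relative to the previous frame. Returns onset positions in samples."""
    onsets = []
    prev = floor  # small floor avoids division by zero in silence
    for start in range(0, len(x) - frame + 1, hop):
        e = float(np.sum(x[start:start + frame] ** 2)) + floor
        if e / prev > ratio:
            onsets.append(start)
        prev = e
    return onsets
```

A detector like this reports something close to the physical onset (PhOT); the paper's point is that for non-percussive attacks such outputs still need a corrective PAT model before they can stand in for perceived timing.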
This Mini-Thesis was submitted in partial fulfilment of the requirements for the degree (Advanced Psychiatric Mental Health Nursing) in the Faculty of Community and Health Sciences, University of the Western Cape.
Organised Sound, 2001
The musical use of realtime digital audio tools implies the need for simultaneous control of a large number of parameters to achieve the desired sonic results. Often it is also necessary to be able to navigate between certain parameter configurations in an easy and intuitive way, rather than to precisely define the evolution of the values for each parameter. Graphical interpolation systems (GIS) provide this level of control by allocating objects within a visual control space to sets of parameters that are to be controlled, and using a moving cursor to change the parameter values according to its current position within the control space. This paper describes Interpolator, a two-dimensional interpolation system for controlling digital signal processing (DSP) parameters in real time.

Organised Sound, 1999
This paper presents an overview of a generic task model of music composition, developed as part of a research project investigating methods of improving user-interface designs for music software (in particular focusing on sound synthesis tools). The task model has been produced by applying recently developed task analysis techniques to the complex and creative task of music composition, to produce a generic description of music composition tasks that could then be used to assist the software design process. The model itself describes the purely practical aspects of music composition, avoiding any attempt to include the aesthetic motivations and concerns of composers. The generic task model (GTM) developed aided our understanding of the nature of music composition tasks, the environment within which tasks are typically carried out, and how these tasks are organised collectively. A summary of early results can be found in Polfreman and Sapsford-Francis (1995), while a complete description of this research can be found in Polfreman (1997b). We go on to illustrate the application of the task model to software design by describing various parts of Modalyser, a graphical user-interface program designed by the author for creating musical sounds with IRCAM's Modalys physical modelling synthesis software. Modalyser has been developed using user-interface ideas emerging from the GTM, particularly in terms of its structure and approach to describing musical elements. It is currently a working prototype, not yet complete in terms of providing all the functionality, that has been made freely available to users of Modalys (Morrison and Adrien 1993) via the IRCAM Software Forum. The task model is not yet complete at all levels and requires further refinement, but is deemed to be sufficiently comprehensive to merit presentation here. Although developed for assisting in software design, the task model may be of wider interest to those concerned with the education of music composition and research into music composition generally.
The musicSpace Project
David Bretherton (D.Bretherton@soton.ac.uk), mc schraefel (PI), Daniel Alexander Smith, Richard Polfreman, Mark Everist, Jeanice Brooks, Joe Lambert. Musicological data is segregated into numerous digital repositories... but what if researchers could use just one?

Proceedings of the International Conference on New Interfaces for Musical Expression, Jun 1, 2020
This paper presents a new visualization paradigm for graphical interpolation systems, known as Star Interpolation, that has been specifically created for sound design applications. Through the presented investigation of previous visualizations, it becomes apparent that the existing visuals in this class of system generally relate to the interpolation model that determines the weightings of the presets, and not to the sonic output. The Star Interpolator looks to resolve this deficiency by providing visual cues that relate to the parameter space. Through comparative exploration it has been found that this visualization provides a number of benefits over previous systems. It is also shown that hybrid visualizations can be generated that combine benefits of the new visualization with the existing interpolation models. These can then be accessed by using an Interactive Visualization (IV) approach. The results from our exploration of these visualizations are encouraging and they appear to be ...

Proceedings of the 19th Sound and Music Computing Conference, June 5-12th, 2022, Saint-Étienne (France), 2022
Graphical interpolators provide a simple mechanism for synthesis-based sound design by offering a level of abstraction above the synthesis parameters. These systems supply users with two sensory modalities in the form of sonic output from the synthesis engine and visual feedback from the interface. A number of graphical interpolator systems have been developed over the years that provide users with different visual cues, via the graphical display. This study compares user interactions with six interpolation systems that have alternative visualizations, in order to investigate the impact that the interface's different visual cues have on the process of locating sounds within the space. We also present a dimension space analysis of the interpolators and compare this with the user studies to explore its predictive potential in evaluating designs. The outcomes from our study help to better understand design considerations for graphical interpolators and will inform future designs.

Force Motion
Proceedings of the 6th International Conference on Movement and Computing
We present preliminary results from an ongoing project at the University of Southampton that aims to develop protocols for motion capture with music conducting. These protocols will facilitate the study of conducting gestures, provide a high-quality open-access data set using professional conductors and provide a platform for developing machine learning for conductor following systems. In this paper we explore the potential use of force-plate data to track conductors' beats as a non-intrusive method for conductor following. Three conductors were captured directing the same piece of music and we analysed a section of the piece where the conductors are working with a click-track, to ensure the intended beats share the same timing and there is an inherent ground truth. We then examined the data from force plate and high-end optical marker tracking, against observer beat tapping and click audio, to determine whether force plate data could serve as a useful analogue in conductor following. The results suggest that with simple analysis of the data, beats can be extracted with comparable timing accuracy to optical marker tracking.
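The "simple analysis" mentioned is not specified in the abstract; one plausible minimal form, offered purely as a sketch (the function name, threshold rule and `min_gap` refractory interval are assumptions, not the paper's method), is thresholded peak picking on the vertical force signal:

```python
import numpy as np

def beat_times(force, fs, min_gap=0.25, threshold=None):
    """Pick beat candidates as local maxima of vertical force that exceed a
    threshold and are at least min_gap seconds apart.

    force : 1-D array of force-plate samples
    fs    : sample rate in Hz
    Returns candidate beat times in seconds.
    """
    if threshold is None:
        threshold = force.mean() + force.std()  # crude adaptive threshold
    beats = []
    for i in range(1, len(force) - 1):
        if force[i] > threshold and force[i] >= force[i - 1] and force[i] > force[i + 1]:
            t = i / fs
            if not beats or t - beats[-1] >= min_gap:  # refractory period
                beats.append(t)
    return beats
```

Comparing such candidate times against click audio and observer tapping is the kind of evaluation the study describes.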
Distance measures for sound similarity based on auditory representations and dynamic time warping
In: Proceedings of the VII International Symposium on Systematic and Comparative Musicology / III International Conference on Cognitive Musicology.
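The title's dynamic time warping component can be stated compactly. This is the standard textbook DTW recurrence over two feature sequences, not necessarily the exact variant or auditory representation used in the paper:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between feature sequences a (n, d) and
    b (m, d), using Euclidean frame-to-frame cost and the standard
    step pattern (insertion, deletion, match)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because the warping path can stretch one sequence against the other, two sounds whose auditory-representation frames match but evolve at different rates still score as similar, which is the property that makes DTW attractive for sound similarity.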
Music has a long tradition of using electronic technology in performance, from early electronic instruments such as the Theremin through to today's plethora of digital devices. With rapid technological change, there is always the risk that important works using live electronics can become unperformable over time, due to the lack of working equipment available, lack of the associated expertise necessary to operate the equipment, or damaged/obsolete storage media containing the performance data. This article reports on some of our experience of reworking pieces for newer technology, and on preliminary work on a research project examining the problem in general.