From Authored to Produced Time in Computer-Musician Interactions
2013
Abstract
Human musicians have long developed methods and formalisms for ensemble authoring and for the real-time coordination and synchronization of their actions. Bringing such capabilities to computers, and providing them with the ability to take part in musical interactions with human musicians, poses interesting challenges for the authoring of time and interaction and for real-time coordination, which we address in this paper in the context of Mixed Music and the Antescofo project.
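The gap between authored and produced time can be made concrete with a small computation. The following Python sketch is only an illustration of the general idea, not Antescofo's actual score language or API; the function name and tempo values are assumptions. A delay authored in tempo-relative units (beats) yields different produced wall-clock times as the detected tempo changes:

```python
# Illustrative sketch (not Antescofo's language or API): converting an
# authored, tempo-relative delay in beats into a produced wall-clock
# duration, given a tempo estimate that may be revised mid-performance.

def produced_time(delay_beats: float, bpm: float) -> float:
    """Seconds corresponding to a beat-relative delay at the current tempo."""
    return delay_beats * 60.0 / bpm

# An action authored 1.5 beats after its triggering event:
print(produced_time(1.5, bpm=120))  # 0.75 s at 120 BPM
print(produced_time(1.5, bpm=90))   # 1.0 s if the musician slows to 90 BPM
```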
Related papers
Proceedings of the SMC, 2009
Discrete Event Dynamic Systems: Theory and Applications, 2013
With the advent and availability of powerful personal computing, computer music research and industry have focused on real-time musical interaction between musicians and computers, delegating human-like actions to computers that interact with a musical environment. One common use case of this kind is Automatic Accompaniment, where the system comprises a real-time machine-listening component that, upon recognizing events of a score in a human performance, launches the actions required for the accompaniment section. While the real-time detection of score events in live musicians' performances has been widely addressed in the literature, score accompaniment (the reactive part of the process) has rarely been discussed.
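The reactive part can be pictured as an event-to-actions dispatch. The Python sketch below is a hypothetical illustration: the event identifiers and actions are invented, and real systems such as Antescofo use a dedicated score language rather than a dictionary of callbacks.

```python
# Hypothetical sketch of the reactive half of an accompaniment system:
# the machine-listening module reports detected score events, and the
# dispatcher fires the actions authored for each event.

from typing import Callable

accompaniment: dict[str, list[Callable[[], None]]] = {
    "e1": [lambda: print("start strings pad")],
    "e2": [lambda: print("trigger harmonizer"), lambda: print("open filter")],
}

def on_event_detected(event_id: str) -> None:
    """Called by the listener whenever it recognizes a score event."""
    for action in accompaniment.get(event_id, []):
        action()

on_event_detected("e2")  # runs both actions authored for event e2
```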
Transforming Human Experience Through Symbiotic Technologies, 2000
For years, a major challenge for artists and scientists has been the construction of music systems capable of interacting with humans on stage. Such systems find applicability in contexts where they are required to act both as independent, improvising agents and as instruments in the hands of a musician; this is widely known as the player and the instrument paradigm. In recent years, research on machine improvisation has made important steps towards intelligent, efficient, and musically sensible systems that in many cases can establish an interesting dialog with a human instrument player. Most of these systems require, or at the very least encourage, the active participation not only of instrument players but also of computer users and operators who conduct actions in a certain way. Still, very little has been done towards a more sophisticated interaction model that would include not only the instrument player and the machine but also the computer operator, who in this case should be considered a computer performer.

In this paper we are concerned with those aspects of enhanced interactivity that can exist in a collective improvisation context: a context characterized by the confluent relationship between instrument players, computers, and humans who perform onstage with the help of the computer. The paper focuses on the definition of a theoretical as well as a computational framework for the design of modern machine improvisation systems that leaves the necessary space for such parallel interactions to occur in real time. We study the so-called three-party interaction scheme based on three concepts: first, with the help of the computer, provide an active role for the human participant, either as an instrument player or as a performer; second, create the framework that allows the computer to be utilized as an augmented, high-level instrument; and last, conceive methods that allow the computer to play an independent role as an improvising agent exhibiting human musical skills.
2004
Across the centuries, musicians have always taken an interest in the latest scientific achievements and have used the latest technologies of their time to produce musical material. Since the mid-20th century, the use of computing technology for music production and analysis has become increasingly common among music researchers and composers. Continuing this trend, the use of network technologies in the field of computer music has more recently become a natural research goal. Our group is investigating the use of ...
Interfaces, 2011
This paper discusses how the electronic, solely IT-based composition and performance of electronic music can be supported in real time by a collaborative application on a tabletop interface, mediating between single-user music composition tools and co-located collaborative music improvisation. After elaborating on the theoretical background and prerequisites of co-located collaborative tabletop applications, as well as the common paradigms in music composition and notation, we review related work on novel IT approaches to music composition and improvisation. Subsequently, we present our prototypical implementation and its results.
2016
Musical duets are a type of creative partnership with a long history of artistic practice. What can they tell us about creative partnerships between a human and a computer? To explore this question we implemented an activity-based model of duet interaction in software designed to support musical metacreation and investigated the experience of performing with it. The activity-based model allowed for the application of reflexive interactive processes, previously used in dialogic interaction, to a synchronous musical performance context. The experience of improvising with the computational agent was evaluated by expert musicians, who reported that interactions were fun, engaging, and challenging, despite some obvious limitations in the musical sophistication of the software. These findings reinforce the idea that even simple metacreative systems can stimulate creative partnerships and, further, that creative human-machine duet partnerships may well produce, like human-human duet partnerships, more than the sum of their parts.

1. INTRODUCTION

This article investigates human-computer creative partnerships in the context of musical duet performance. We describe a computational music agent, CIM, designed for duet performance with a human musician, and evaluate the effectiveness of the system at stimulating an engaging musical interaction and engendering a sense of human-computer collaboration. We view human-computer creative partnerships involving musical metacreation as a particular kind of human-computer interaction, in which the computer has a degree of agency, and the phenomenological experience of the interaction includes elements of partnership, cooperation, and negotiation [Jones et al. 2012]; this contrasts with instrumental approaches to creativity support systems, where the computer functions as a tool [Shneiderman et al. 2006]. CIM utilises an activity-based model of interaction, where the musical outputs of both performers (human and computer) are categorised into a few discrete activities, according to the relationship between the current output and previous output from either performer. This facilitates description of temporal structure in the performance, representing inter-part and intra-part relationships separately from the representation of surface musical content.

In order to investigate the experience of metacreative musical duets, we collated and compared subjective impressions of experienced human musicians interacting with CIM. We employed a mixed quantitative/qualitative survey instrument probing the performers' experience of interaction. Through analysis of this data we identified aspects of CIM's behaviour that influenced its effectiveness as a musical collaborator, from which we extrapolate to suggest interaction approaches that may foster human-computer creative partnerships more broadly. The results of the evaluation suggested that the system was effective at stimulating an engaging musical collaboration. Striking a balance between unexpectedness and predictability emerged as a key factor, as might be expected. The combination of a reflexive approach with an activity-based interaction model appears to have been an effective platform for mediating these opposing tendencies, allowing musically meaningful engagement with the system, despite its musical 'knowledge' being quite limited.
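As a rough illustration of what an activity-based model might look like, the Python sketch below classifies a performer's current output by its similarity to the previous output of either part. The scalar feature representation, the threshold, and the activity labels are all invented for illustration; the paper defines its own categories.

```python
# Illustrative sketch of an activity-based interaction model: each
# output is labelled by its relation to previous own/partner output,
# separating temporal structure from surface musical content.

def classify_activity(current: float, own_prev: float,
                      partner_prev: float, tol: float = 0.1) -> str:
    """Label the current output relative to both performers' last output."""
    def similar(a: float, b: float) -> bool:
        return abs(a - b) <= tol
    if similar(current, partner_prev):
        return "imitate"   # echoing the partner's last output
    if similar(current, own_prev):
        return "continue"  # sustaining one's own material
    return "contrast"      # introducing new material

print(classify_activity(0.52, own_prev=0.9, partner_prev=0.5))  # imitate
```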
Computer music systems that coordinate or interact with human musicians exist in many forms. Often, coordination is at the level of gestures and phrases without synchronization at the beat level (or perhaps the notion of “beat” does not even exist). In music with beats, fine-grain synchronization can be achieved by having humans adapt to the computer (e.g. following a click track), or by computer accompaniment in which the computer follows a predetermined score. We consider an alternative scenario in which improvisation prevents traditional score following, but where synchronization is achieved at the level of beats, measures, and cues. To explore this new type of human-computer interaction, we have created new software abstractions for synchronization and coordination of music and interfaces in different modalities. We describe these new software structures, present examples, and introduce the idea of music notation as an interactive musical interface rather than a static document.
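A minimal Python sketch of what beat-level coordination can look like follows; it is not the authors' actual software abstractions, and the scheduler interface is an assumption. Actions are queued against a shared beat counter and fired on measure boundaries, so synchronization survives even when the material itself is improvised:

```python
# Sketch of beat-level synchronization: actions scheduled on the
# downbeat of the next measure of a shared clock.

import heapq
import itertools

class BeatScheduler:
    def __init__(self, beats_per_measure: int = 4):
        self.beats_per_measure = beats_per_measure
        self._counter = itertools.count()  # tie-breaker for equal beats
        self._queue = []                   # (beat, seq, action)

    def at_next_measure(self, current_beat: int, action) -> None:
        """Queue an action for the downbeat of the next measure."""
        downbeat = ((current_beat // self.beats_per_measure) + 1) * self.beats_per_measure
        heapq.heappush(self._queue, (downbeat, next(self._counter), action))

    def tick(self, beat: int) -> None:
        """Called once per beat; runs every action whose time has come."""
        while self._queue and self._queue[0][0] <= beat:
            _, _, action = heapq.heappop(self._queue)
            action(beat)

sched = BeatScheduler()
sched.at_next_measure(5, lambda b: print(f"cue ensemble at beat {b}"))
for b in range(5, 10):
    sched.tick(b)  # fires at beat 8, the next downbeat
```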
2010
Coordination between ensembles of improvising electroacoustic musicians is a special case of the larger HCI problem of coordinating joint, real-time activity, one that involves some interesting additional and different challenges. This paper reports on research that has identified two specific real-time coordination problems for ensembles of electroacoustic musicians: "who makes what sound?" and "how is the sound being altered?" Real-time sound visualization is explored as a possible solution to assist musicians in overcoming some of these challenges. The main contribution of this paper is that, counterintuitively, for certain kinds of joint real-time coordination activities, temporal representations are important in helping to determine "who did what?"
Organised Sound, 2014
This article explores listening and communications strategies that arise with a collaborative scoring system we are developing for use within improvisational contexts. Performers generate notation on a scrolling score a short time before it is played or rendered into sound. Working a short time in the future allows performers to respond to sound as they would in any improvisatory situation, and yet coordinate their activity through notation in a way typically associated with pre-composed music. The ‘Anticipatory Score’ platform supports the exploration of different kinds of relationships between performers, composers and audience members, and different listening and engagement strategies that affect the musical experience for all participants.
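The look-ahead idea behind such an anticipatory score can be sketched in a few lines of Python. The fixed look-ahead value and the queue-based design below are assumptions made for illustration, not the platform's actual implementation:

```python
# Sketch of an anticipatory score: notation written now becomes audible
# a fixed time later, so performers can read and react to what is about
# to be played.

import collections

LOOKAHEAD = 4.0  # seconds between writing a note and hearing it (assumed)

pending = collections.deque()  # (play_time, notation), in insertion order

def write_notation(now: float, notation: str) -> None:
    """A performer adds notation; it sounds LOOKAHEAD seconds later."""
    pending.append((now + LOOKAHEAD, notation))

def render(now: float) -> None:
    """Play every queued item whose scheduled time has arrived."""
    while pending and pending[0][0] <= now:
        _, notation = pending.popleft()
        print(f"{now:.1f}s: sounding {notation}")

write_notation(0.0, "G4 quarter")
render(4.0)  # the note written at t=0 sounds at t=4
```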
Computer Music Journal, 2014
Computers have the potential to significantly extend the practice of popular music based on steady tempo and mostly determined form. There are significant challenges to overcome, however, due to constraints including accurate timing based on beats and adherence to a form or structure despite changes that may occur, even during performance. We describe an approach to synchronization across media that takes into account latency due to communication delays and audio buffering. We also address the problem of mapping from a conventional score with repeats and other structures to an actual performance, which can involve both "flattening" the score and rearranging it, as is common in popular music. Finally, we illustrate the possibilities of the score as a bidirectional user interface in a real-time system for music performance, allowing the user to direct the computer through a digitally displayed score, and allowing the computer to indicate score position back to human performers.
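Score "flattening" can be illustrated with a small sketch. The Python function below expands repeat counts into the linear order of sections actually performed; the section names and the repeat scheme are invented for illustration, and the paper's actual mapping also handles rearrangement and other structures:

```python
# Sketch of score flattening: expand a score with repeats into the
# linear sequence of sections actually performed.

def flatten(sections: list[str], repeats: dict[str, int]) -> list[str]:
    """Expand each section by its repeat count (default: played once)."""
    order: list[str] = []
    for name in sections:
        order.extend([name] * repeats.get(name, 1))
    return order

score = ["intro", "verse", "chorus", "outro"]
print(flatten(score, {"verse": 2, "chorus": 3}))
# ['intro', 'verse', 'verse', 'chorus', 'chorus', 'chorus', 'outro']
```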
