
Real-time event sequencing without a visual interface

2013

Abstract

In electronic music, it is often useful to build loops from discrete events, such as playing notes or triggering digital effects. This process generally requires a visual interface, as well as a pre-defined tempo and time quantization. We present a novel digital musical instrument capable of looping events without a visual interface or explicit knowledge of tempo or time quantization. The instrument is built around a prediction algorithm that detects repetitive patterns over time, allowing rhythmic layers to be constructed in real-time performances. It has been used in musical performances, where it proved adequate in contexts that allow improvisation.
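The abstract describes a prediction algorithm that detects repetitive patterns over time. One way to sketch such a predictor, assuming events are reduced to plain string labels, is to find the longest suffix of the history that has already occurred and return the label that followed it. This is a minimal illustration under that assumption; the function name and drum labels below are hypothetical, not taken from the paper.

```python
def predict_next(history: list[str]) -> str | None:
    """Guess the next event label: locate the longest suffix of `history`
    that already occurred earlier, and return the label that followed
    that earlier occurrence."""
    n = len(history)
    for length in range(n - 1, 0, -1):       # try the longest suffix first
        suffix = history[n - length:]
        for start in range(n - length):      # scan earlier positions
            if history[start:start + length] == suffix:
                return history[start + length]
    return None

# A repeating kick/hat/snare/hat pattern: the suffix [kick, hat, snare]
# already occurred at the start, where it was followed by "hat".
events = ["kick", "hat", "snare", "hat", "kick", "hat", "snare"]
print(predict_next(events))  # -> "hat"
```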

FAQs

What unique features differentiate the proposed instrument from traditional drum machines?

The instrument can loop event sequences without a predefined tempo or quantization grid, providing greater flexibility. Musicians can improvise much as they would tap out a rhythm, enhancing creative expression.

How does the online learning algorithm enhance the instrument's performance?

The online-learning algorithm predicts continuations of played sequences in real time, adapting quickly to the musician's input. This results in a more natural flow of rhythm generation without reliance on strict timing.
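As a rough illustration of how online adaptation might drive looping, the sketch below reuses the hypothetical `predict_next` from above and starts auto-playing predicted events only after several consecutive correct guesses. The streak-based confidence policy is an assumption made for this sketch, not the paper's stated method.

```python
class OnlineSequencer:
    """Learns from each incoming event and begins looping only after the
    predictor has anticipated several events in a row (an assumed policy
    for deciding that a repetitive pattern has been detected)."""

    def __init__(self, confidence_needed: int = 4):
        self.history: list[str] = []
        self.streak = 0                        # consecutive correct guesses
        self.confidence_needed = confidence_needed

    def on_event(self, label: str) -> str | None:
        guess = predict_next(self.history)
        self.streak = self.streak + 1 if guess == label else 0
        self.history.append(label)
        if self.streak >= self.confidence_needed:
            return predict_next(self.history)  # event to sound next
        return None
```

Feeding the `events` list above into `on_event` one label at a time, the sequencer stays silent until the pattern has repeated often enough to build a streak.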

What practical applications were demonstrated through the instrument's performances?

It successfully facilitated multi-layer rhythms and improvisation in solo and duo performances, showcasing its flexibility. The performances revealed the instrument's ability to adapt dynamically without predefined constraints.

How does the system handle user-defined actions and event labels?

Users can define event labels representing discrete musical gestures, enabling customized interaction for sound manipulation. Each event is described by its onset time and label, which supports varied uses such as switching effects or building drum patterns.
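Taking the FAQ at its word that an event is just an onset and a label, a minimal representation might look like the following; the field names and example labels are chosen here for illustration.

```python
from dataclasses import dataclass

@dataclass
class Event:
    onset: float  # seconds since the performance started
    label: str    # user-defined gesture, e.g. a drum hit or an effect toggle

# Hypothetical stream: the sequencer sees only onsets and labels, so the
# same machinery can loop drum hits and effect switches alike.
stream = [
    Event(0.00, "kick"),
    Event(0.50, "reverb_on"),
    Event(1.00, "snare"),
]
```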

What limitations does the absence of a visual interface impose?

Without a visual interface, users cannot visually review or edit sequences, which restricts offline composition and the retrieval of stored rhythms. This trade-off favors flexibility in live performance but may challenge musicians accustomed to traditional notation.
