Papers by Elizabeth Medina-Gray
Meaningful Modular Combinations: Simultaneous Harp and Environmental Music in Two Legend of Zelda Games
Music in Video Games: Studying Play (Routledge), 2014
Dissertation by Elizabeth Medina-Gray

One of the most critical aspects of music in video games is that it is dynamic. This music, along with a game's visuals, events, and other audio, is flexible across gameplay, and its progression depends on various real-time factors, including the player's individual actions. For example, the player may move her character to a new area in the game's virtual world and the musical score might change in response; similarly, a particular in-game event may trigger a brief motive that sounds on top of the game's other ongoing music. As each player's experience with a particular game is unique, so each real-time musical soundtrack shapes itself, to varying degrees, to match each individualized play session. Music is an integral part of gameplay and of the larger multimedia object of the game, but this medium's dynamic quality challenges the study of video games and game music in ways that are unfamiliar in studies of more static media, such as most concert music and film. In particular, how can we study music whose final content and structure are unknown until the moments of gameplay? How can we treat each real-time soundtrack, and its contributions to gameplay, equally for an infinite number of individual players?
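The modular, trigger-driven behavior described above can be pictured with a minimal, hypothetical sketch in Python: a toy manager swaps the looping area theme when the player enters a new location and layers a brief motive on top of the ongoing score when an event fires. Every name here (DynamicMusicManager, on_area_change, and the track labels) is invented for illustration and does not come from the dissertation or from any particular game engine.

class DynamicMusicManager:
    """Toy model of a dynamic game score assembled from modules."""

    def __init__(self, area_themes):
        # area_themes: hypothetical mapping of area name -> looping track label
        self.area_themes = area_themes
        self.current_theme = None
        self.layers = []

    def on_area_change(self, area):
        # The looping score changes in response to the player's movement.
        theme = self.area_themes.get(area)
        if theme is not None and theme != self.current_theme:
            self.current_theme = theme
            self.layers = [theme]

    def on_event(self, motive):
        # A brief motive sounds on top of the game's other ongoing music.
        self.layers.append(motive)

    def current_mix(self):
        # The real-time soundtrack is whatever modules are active right now.
        return list(self.layers)

# Each play session yields a different mix, depending on the player's actions.
music = DynamicMusicManager({"village": "village_loop", "cave": "cave_loop"})
music.on_area_change("village")
music.on_event("item_fanfare")
print(music.current_mix())  # ['village_loop', 'item_fanfare']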
Book Reviews by Elizabeth Medina-Gray
Sound Play: Video Games and the Musical Imagination, by William Cheng
Talks by Elizabeth Medina-Gray

Sound Effects as Music (or Not): Earcons and Auditory Icons in Video Games
Defining “music” in video games may at first seem to be unproblematic. Familiarly, music involves sustained organization of sound across time according to structures of pitch and/or rhythm; in video games, the score that accompanies gameplay typifies this category. However, many sound effects in games—brief sounds tied to gameplay actions or events through what Karen Collins (2013) has called kinesonic synchresis—are also musical in that they contain pitch or rhythm; Collins, Reale (2014), and others have pointed out that sound effects can even become part of a game’s music in particular contexts. Sound effects provide an important channel of communication between player and game. When such sounds become music, what implications might this status have for interactive gameplay?
This paper introduces a framework in which to consider the relative musicality of sound effects in games. First, this paper adopts a distinction from the field of Human-Computer Interaction between two types of sound effects that convey information to the user: auditory icons—naturalistic sounds with pre-existing associations—and earcons—abstract sequences of tones. Individual earcons are typically musical and auditory icons are typically non-musical, but neither type is, by itself, music. Next, this paper considers sound effects in their wider context, and especially together with a game’s musical score, drawing on the author’s (2014) analytical method for gauging smoothness between layers of game soundtracks. This paper examines several examples from various games to suggest ways in which sound effects’ status as music—or not—might subtly or significantly impact gameplay.
Analyzing Modular Smoothness in Video Game Music
Chance and Choice in the Assembly of Video Game Soundtracks
Modularity and Dynamic Play: Video Game Music and its Avant-garde Antecedents
Combat Music and Transitional Seams: Toward a Theory of Musical Modularity in 21st-Century Video Games