Originally posted December 14, 2018 to littlelostfox.com/interactive-music
Hello! I’m Kyle Okaly, and I created all the music and sound throughout Valleys Between. In the past I’ve worked as both a film/game composer and a programmer, but rarely both on the same project. The team at Little Lost Fox has been amazing to work with for many reasons, but the one most relevant here is that they let me experiment with some cool ideas in programming the audio I created. As a result, we ended up with an interactive music system that is fairly interesting (in my very biased opinion!) and subtly deeper than it may initially seem.
Before we dove into discussing the technical aspects of how music would work in Valleys Between, the team and I figured out a general aesthetic that we would aim for. The consensus was something that we felt matched the tone of the visual art and gameplay best: something calm, simple, melodic, understated, and beautiful. I created a few mockups, we went back and forth with feedback and impressions, and we singled out a few themes we liked.
Once we had a pretty clear idea of what the music should sound like, I began thinking about interesting ways to deliver it. One of the goals for the audio in Valleys Between was to make the game feel more tactile and interactive, so I was very interested in borrowing some ideas from procedural music to base some aspect of our score on player input. Certain styles of music may lend themselves to procedural generation more naturally than others, and our choices didn’t seem like a great fit: we had melodic themes that we wanted to be recognizable, we had clear tonal centers and well-defined chord changes, and (maybe worst of all) it was important that the tone remained peaceful, meditative, relaxing—pretty much the opposite of "unpredictable".
Don’t get me wrong, I thought these were the perfect stylistic choices for the game, but I still wanted to figure out a way to make the music feel reactive and give the player the sense that they were guiding or shaping it in some way. I’ll spare you the details of every attempt along the way and summarize the system we ended up with, which I unofficially refer to as our "Musical Phrase System".
All player actions in Valleys Between are performed by either swiping up or down on a tile, so the entirety of player input can be thought of as a stream of swipes. We decided to connect the frequency of these swipes to the amount of ongoing musical activity. Since our goal was to maintain a relaxing, zen-like state, this system was less about "ramping up" the music during active periods and more about letting the music fade into the ambience of nature during periods of inactivity.
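To make that idea concrete, here's a minimal sketch of how a swipe-activity tracker could work. Everything here is illustrative (the class name, the threshold value); it's just one plausible way to model "has the player acted recently?", not the game's actual code.

```python
import time

class SwipeActivityTracker:
    """Toy model: treats player input as a stream of swipes and answers
    whether the player has been active recently. Names and the threshold
    are invented for illustration."""

    def __init__(self, quiet_threshold_s=8.0):
        # How long without a swipe before the player counts as "inactive".
        self.quiet_threshold_s = quiet_threshold_s
        self.last_swipe_time = None

    def on_swipe(self, now=None):
        # Record the timestamp of the latest swipe (up or down, doesn't matter).
        self.last_swipe_time = now if now is not None else time.monotonic()

    def is_active(self, now=None):
        # Active if any swipe happened within the quiet threshold.
        if self.last_swipe_time is None:
            return False
        now = now if now is not None else time.monotonic()
        return (now - self.last_swipe_time) < self.quiet_threshold_s
```

The music system would then poll something like `is_active()` to decide whether to keep layers playing or let them fall away.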
When gameplay begins, only ambient nature sounds can be heard. When the first player action occurs, the base layer of the first musical phrase begins playing. If another player action occurs before the phrase has finished, the appropriate seasonal layer is added. This achieves "maximum musical activity", which is also sort of "standard musical activity"—if you’re actively playing the game, you will likely be in this state for the majority of your experience.
When a musical phrase ends, the next phrase begins immediately. (They feel seamless—the player isn’t meant to hear them as separate phrases, but as one continuous musical track.) If no player action occurred during the phrase that just ended, the next phrase will begin with one less layer. When the final layer of a phrase is removed, instead of playing the next phrase, we trigger an inactive phrase that matches the chord of the next phrase. If a player action occurs during this inactive phrase, the next phrase is immediately triggered.
Since the inactive phrase doesn’t imply any rhythms, beginning the next phrase at any point during it feels natural, and since its notes work within the chord of the next phrase, any overlapping notes sound "correct". If an inactive phrase finishes without any player interaction, all music is once again silenced, and we are left with nature ambience until the next player action.
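The behavior described over the last few paragraphs can be sketched as a small state machine. This is a hedged toy model, not the shipping implementation: the state names, the two-layer count (base plus seasonal), and the per-phrase activity flag are all assumptions made for illustration.

```python
# Three states for the musical phrase system (names invented here).
SILENT = "silent"                    # nature ambience only
ACTIVE = "active"                    # a musical phrase is playing
INACTIVE_PHRASE = "inactive_phrase"  # sparse, chord-matched filler

class PhraseSystem:
    """Toy model of the phrase system: layers build with player actions
    and fall away during inactivity, passing through an inactive phrase
    before full silence."""

    MAX_LAYERS = 2  # assumed: base layer + one seasonal layer

    def __init__(self):
        self.state = SILENT
        self.layers = 0
        self.swiped_this_phrase = False

    def on_swipe(self):
        if self.state == SILENT:
            # First action: start a phrase with just the base layer.
            self.state = ACTIVE
            self.layers = 1
        elif self.state == ACTIVE and self.layers < self.MAX_LAYERS:
            # Action before the phrase finishes: add the seasonal layer.
            self.layers += 1
        elif self.state == INACTIVE_PHRASE:
            # Action during an inactive phrase: trigger the next phrase now.
            self.state = ACTIVE
            self.layers = 1
        self.swiped_this_phrase = True

    def on_phrase_end(self):
        if self.state == ACTIVE and not self.swiped_this_phrase:
            # No action during the phrase: the next one drops a layer.
            self.layers -= 1
            if self.layers == 0:
                # Final layer removed: play the chord-matched inactive phrase.
                self.state = INACTIVE_PHRASE
        elif self.state == INACTIVE_PHRASE and not self.swiped_this_phrase:
            # Inactive phrase finished untouched: back to nature ambience.
            self.state = SILENT
        self.swiped_this_phrase = False
```

Walking the model through a session: a swipe starts the base layer, a second swipe reaches "maximum musical activity", and each subsequent phrase boundary without input sheds a layer until the inactive phrase, then silence.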
Check out the video at the top of this page to step through the process in real time.
While designing this system, we struggled to find the right balance between interactivity and music that felt calm and natural. I experimented with much shorter phrases—a few seconds and a few notes each—and while this made notes feel more directly triggered by player actions, it also inevitably felt choppy and strange when actions were performed at certain intervals. It became clear that the music shouldn’t change much while actions were being performed often, so the best opportunity for direct musical feedback was after a period of inactivity.
This actually worked quite well with the spirit of the game. One of the main themes is developing a closer bond with nature, and early in the process we decided that the music should leave some room for ambient natural sounds to occasionally be the focal point. Rather than building softer segments or gaps into a pre-defined musical track, these nature-focused moments are now entirely user-defined. If you simply stop for a while to listen, the music will gradually become less complex, and eventually fade away completely.
As a side note, many sound effects are also programmed to take advantage of the musical phrase system: since each phrase keeps track of the current chord, certain musical sound effects play different variations depending on that chord.
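That chord-aware lookup might look something like the sketch below. To be clear, the chord names, asset filenames, and function are all hypothetical stand-ins; the point is only that the phrase system's current chord selects which variation of a sound effect plays.

```python
# Hypothetical mapping from the current chord to sound effect variations;
# the chords and asset names here are invented for illustration.
CHIME_VARIATIONS = {
    "Cmaj": "chime_c_major.wav",
    "Amin": "chime_a_minor.wav",
    "Fmaj": "chime_f_major.wav",
}

def pick_sfx_variation(effect_variations, current_chord, default=None):
    """Choose the variation of a musical sound effect that fits the chord
    the current phrase is tracking, falling back to a default if the
    chord has no dedicated variation."""
    return effect_variations.get(current_chord, default)
```

Because the phrase system already knows the chord at any moment, sound effects get tonal coherence essentially for free.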
I had a blast working on this with Little Lost Fox, and I learned a lot throughout the process. Creating something new based on an idea—a theoretical "that would be cool", rather than a concrete "this has worked in the past"—is always an adventure. No matter how well-formed the idea may seem, the end result is rarely what was expected going in, and the process of getting there is even harder to predict.