The iMuse system really is remarkable. Games like X-Wing took great advantage of its features: when a Star Destroyer jumped into the game, the music would seamlessly transition to the Imperial March, and it felt just like being in the movies. I don't think any modern system even tries to do those seamless transitions from one music piece to another.
One thing I wonder about: he mentions CD audio (Red Book?) as one capability of the system. But the CD-audio games like X-Wing vs. TIE Fighter were much more limited in that sense. You'd literally just hear the music switch to the new track. And The Force Unleashed, the last game that used iMuse, wasn't particularly remarkable if memory serves. I wonder if that was a limitation they just couldn't quite make as seamless?
I figure today you could do it with a "virtual MIDI" system using MP3 audio of individual instrument sounds...
Edited to add: that last sentence is essentially what a DAW provides.
This was also done very well in Monkey Island 2, in which a lot of care was taken in transitioning music with custom bridges. It was quite subtle and lovely, and is considered a high point in video game music.
1. A video demo: https://www.youtube.com/watch?v=7N41TEcjcvM
2. Some details: https://mixnmojo.com/features/sitefeatures/LucasArts-Secret-...
Games today feature dynamic music with loops, transitions, and individual stems that can be remixed at runtime. One prominent example (to me, at least) is "Take Control" playing over the Ashtray Maze in Control. This sounds like an absolutely seamless prog metal song while playing, but it is actually highly reactive to the gameplay - the rapid-fire sequence of battle arenas and fast-paced corridors. The player stays in absolute control of the pacing the whole time.
Similar with Herald of Darkness in Alan Wake 2 "We Sing" level, the song loops through bridges based on how long you take to play through it.
And those are only the most obvious examples - games like Deus Ex featured dynamic music transitions decades ago.
Tetris Effect is also a great example of this. Each movement and rotation of pieces impacts the score and each level has varied genres. One of my favorites is the New York City Jazz level.
Hi-Fi Rush did some of the opposite: the gameplay in certain parts shrinks or stretches so it takes the right amount of time to hit the next musical cue.
The Ashtray Maze is a masterpiece, and the music is indeed core to its experience.
Nier Automata comes to mind as an example: it has many versions (musically and lyrically) of the same pieces for each area and transitions between them.
Final Fantasy XIV does this a lot. Boss fights' music will often change depending on what phase of the fight you're in, and in some the music will gradually transition to the heroic themes at the right moment.
Take Control is amazing.
Much like the Need for Speed series (I believe it was introduced in 1998, in the third installment called Hot Pursuit).
> I wonder if that was a limitation they just couldn't quite make as seamless?
It's a fundamental limitation of CD audio. There isn't enough buffering to keep playing sound while the laser seeks to the next track, so there must be a gap. The gap isn't even predictable: the seek time varies from drive to drive, and even on the same drive.
With CD audio, your CD-ROM drive actually switches mode to become a regular CD player. The digital samples don't get sent to your sound card; the drive has all the electronics required to decode the digital audio and convert it to analog. All your sound card does is mix the analog output from your CD-ROM drive with everything else.
The game can only really send "skip to track" style commands to the drive, more or less the same set of commands you could send with a proper CD player's remote.
CD audio and other streamed formats create trade-offs versus MIDI event sequences. Streaming is a simple playback method offering a lot of fidelity, but in exchange you're tied to one of: "one track at a time while the CD spins up in between" (Red Book CD), cueing uncompressed sampled tracks (feasible but memory intensive), or cueing one or more lossy-compressed streams (which added performance or hardware-specific considerations at the time, and in many formats also limits your ability to seek to a particular point during playback or do fine-grained alterations with DSP). So as a dynamic music system, streaming tends to lend itself to brief "stings" like the Half-Life 1 soundtrack, or to simple explore/combat loops that crossfade or overlay on each other. Tempo and key changes have been off the table until recently (and even then, they really impact sound quality). DJ software offers the best examples of what can be done when combining prerecorded material live, and there are some characteristic things about how DJs perform transitions and mashups that are musically compelling but won't work everywhere for all material.
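The explore/combat crossfade mentioned above is simple enough to sketch. This is a toy illustration (the function name and the sine-tone "stems" are invented for the example, not from any real engine): an equal-power crossfade splices two prerecorded loops by overlapping their tails with sin/cos gain curves, which keeps the summed power roughly constant and avoids the volume dip a linear fade produces.

```python
import numpy as np

def equal_power_crossfade(outgoing: np.ndarray, incoming: np.ndarray,
                          fade_samples: int) -> np.ndarray:
    """Splice two stems with an equal-power (sin/cos) crossfade."""
    t = np.linspace(0.0, np.pi / 2, fade_samples)
    fade_out = np.cos(t)  # gain curve for the outgoing loop
    fade_in = np.sin(t)   # gain curve for the incoming loop
    overlap = outgoing[-fade_samples:] * fade_out + incoming[:fade_samples] * fade_in
    return np.concatenate([outgoing[:-fade_samples], overlap, incoming[fade_samples:]])

# Two one-second 44.1 kHz "stems" (sine tones standing in for real audio).
sr = 44100
time = np.arange(sr) / sr
explore = 0.5 * np.sin(2 * np.pi * 220 * time)
combat = 0.5 * np.sin(2 * np.pi * 440 * time)

mixed = equal_power_crossfade(explore, combat, fade_samples=sr // 10)
```

Note that this only works gracefully when the material tolerates free overlap; beat- and key-matched transitions (the DJ tricks mentioned above) need much more machinery.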
MIDI isn't really that much better, though - it's a compatibility-centric protocol, so it doesn't address the heart of the dynamic-audio problem: "how do I coordinate this?" All it is responsible for is an abstract "channel, patch number, event" system, leaving the details of coordinating multiple MIDI sequences and triggering appropriate sounds to be worked out in the implementation. An implementation that does everything a DAW does with MIDI sequences also has to implement all the DSP effects and configuration surfaces, which is out of scope for most projects, although FMOD does enable something close to that.
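To see just how thin the "channel, patch number, event" layer is, here is a minimal sketch that builds raw MIDI channel-voice messages (the helper functions are illustrative names, but the byte layout is the standard MIDI 1.0 wire format): everything about *when* and *why* these bytes get sent - the actual coordination problem - is left entirely to the implementation.

```python
def program_change(channel: int, patch: int) -> bytes:
    """MIDI Program Change: status byte 0xC0 | channel, then the patch number."""
    assert 0 <= channel < 16 and 0 <= patch < 128
    return bytes([0xC0 | channel, patch])

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """MIDI Note On: status byte 0x90 | channel, then note and velocity."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

# Select patch 56 (General MIDI trumpet) on channel 0, then play middle C (note 60).
msgs = program_change(0, 56) + note_on(0, 60, 100)
```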
I think the best approach for exploring dynamic and interactive music right now is to use systems that allow for live coding - Pure Data, SuperCollider, etc. These untangle the principal assumption of "either audio tracks or event sequences" and allow choice, making it more straightforward to coordinate everything centrally, do some synthesis or processing, some sequencing, and adopt novel methods of notation. The downside is that these are big runtimes with a large deployment footprint, so they aren't something people just drop into game engines.
> I figure today you could do it, with a "virtual MIDI" system using MP3 audio of individual instrument sounds ..
Reinventing tracker music, in other words? =D
X-Wing vs Tie Fighter: available memory and CD seek performance really restricted what you could do then.
The Force Unleashed: this is one of those "succeeds if it's invisible" things. The music is procedural, based on mixing rhythmic and arrhythmic stems. That allowed continuous crossfades without needing to precisely match beats - again a workaround for the limitation of not being able to precisely line up stems.

The other fun thing introduced was physics-driven synthesis. The DMM system fed information about strain, impacts, and other events into a granular synthesizer. The bussing and ducking architecture was derived from this paper by Walter Murch: https://transom.org/wp-content/uploads/2005/04/200504.review...

Fun anecdote: I was at a party with some audio nerds, raving about the paper to a new acquaintance, who interrupted me and said, "Oh, I wrote that!"
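The idea of feeding a physics parameter into a granular synthesizer can be sketched in a few lines. This is a toy, not the actual DMM pipeline (the function, parameter names, and noise "source" are invented for illustration): short windowed grains of a source buffer are scattered into the output, with grain density driven by a "strain" value from the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def granular(source: np.ndarray, strain: float, out_len: int,
             grain_len: int = 512) -> np.ndarray:
    """Toy granular synth: higher 'strain' -> denser grain scheduling."""
    out = np.zeros(out_len)
    window = np.hanning(grain_len)  # smooth each grain's edges
    n_grains = int(1 + strain * 50)  # density driven by the game parameter
    for _ in range(n_grains):
        src = rng.integers(0, len(source) - grain_len)
        dst = rng.integers(0, out_len - grain_len)
        out[dst:dst + grain_len] += source[src:src + grain_len] * window
    return out

source = rng.standard_normal(44100) * 0.1  # stand-in for recorded debris audio
quiet = granular(source, strain=0.1, out_len=22050)    # light stress on the object
violent = granular(source, strain=1.0, out_len=22050)  # object being torn apart
```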
https://www.youtube.com/watch?v=_MguVQ1Fja8&t=8 - an example of the X-Wing iMUSE soundtrack! You can hear it swap in memorable tags, literally without missing a beat.
From a musical theater composition perspective, it's almost like building around vamp sections: https://romanbenedict.com/vamp-safety-repeat/ - you build a neutral, repeatable motif that you can easily lay under unpredictably timed segments (e.g. spoken dialogue), that's primed to "explode" into a memorable melody whenever the on-stage timing calls for it!
X-Wing just had great music. Even the original stuff was great. The music for the training run was perfect.
Modern games have similar reactive music systems but I've never heard one I felt was better than X-Wing's. They got it right on the first try.
What made these games different was that the musical themes were significant and well known long before you installed your Sound Blaster. The music was mixed at high intensity out of the box, allowing it to influence you, with each track tailored to the moment.
This gave the series a leg up in that the music could actually communicate information effectively -- a tense moment, the shifting tide of the battle, the calm after a victory -- whereas other games simply had to put up waveforms that sounded pleasing.
To be fair many games experimented with sound design in this era, but few had such legendary IP to build with. An unfair advantage to say the least. The folks wielding iMUSE clearly knew what they had.
Again, the original stuff was great too, I don't think it was just the familiar themes.
Dynamic music systems are standard in modern game development: https://www.fmod.com/studio
(Whether or not the game actually does anything interesting with them is its own question.)
I might be deceiving myself, but as I recall, I was very satisfied with how World of Warcraft handled environmental music changes. iMUSE may be a whole other level, though.
You have awakened some incredible memories. I know exactly what you are talking about.
> I don't think any modern system even tries to do those seamless transitions from one music piece to another.
You will be pleased to hear that plenty of games since then have continued to use that same technique, and there are in fact entire realms of game-dev systems dedicated to enabling that experience!
> I don't think any modern system even tries to do those seamless transitions from one music piece to another.
Games definitely do this.
A music player that is able to change the music dynamically is neat in itself, but to me the true story behind systems like these is the tools and processes used to create the content for them. Making a technical system approachable to a creative mindset is at least as much of a challenge as the system itself.
iMUSE was used for some really beautiful music in its time, so LucasArts had this figured out. But I'd be curious to learn how they did it.
To my understanding, iMUSE had the benefit of being something of a peak of the Programmer-Musician. The lead developers on iMUSE were also the musicians, in one way or another. In the '80s and early '90s a lot of music was added to games by programming it directly into the game ROM, so there was a great need for programmers who were also musically literate to design the music in games. Several of the LucasArts composers straddled this line: they entered the industry when a lot of music was hardcoded into software as source code, then watched the industry shift around them to bringing in "real" musicians without programming backgrounds. Some of them retired or pseudo-retired in that shift; others fought to be seen as "real" musicians and had great careers even after it. (They were all real musicians to me, as a player and fan.)
Today we're starting to see the growth of the Musician-Programmer. Modern DAWs and middleware like FMOD and Wwise allow a lot of that dynamism and layering for musicians willing to learn them - to dabble in just enough technical knowledge to be dangerous (as a good thing).
There's probably always going to be an interesting Venn diagram intersection of Musician and Programmer. They are related mathematically and historically. Most of the earliest versions of punch cards were for looms (weaving), then for music, with computers being the third application. The tools and processes change over time, but the relationships remain.
I worked with Nick back in the ILM R&D group. He's an incredibly kind man and one of the best developers I've ever met; truly a genius.
The music immersion in Monkey Island 2 thanks to iMUSE was extraordinary.
You must play MM1 and then MM2 to truly appreciate the difference.
But not in the remastered version. It uses live orchestral recordings and attempts to simulate the dynamic transitions of the original iMUSE system, but the implementation is less intricate, with fewer transition points and less complexity than the original.
Direct comparison: https://www.youtube.com/watch?v=JkMoHEFtnLQ
I was obsessed with the idea of music production as an engine within a game a long time ago, after coming across in passing how Elder Scrolls Online created its soundtrack in a similar manner. This resurfaced in my mind when I started digging into Suno and other AI-generated music recently, and it's fun to wonder what will be possible in storytelling for games and visual novels with the ability to limitlessly adapt and change based on player interactions.
If I remember correctly, another game with a similar music system is Deus Ex from 2000. It is pretty approachable: if you own a copy, open any of the .s3m music files in your favorite tracker editor. Each song file contains multiple versions of song sequences depending on the mood (idle, battle, ...).
I remember reading a PC Magazine article about Rogue Squadron for the N64. Apparently it was one of the first games to feature a context specific soundtrack.
First one I remember it in was X-Wing (1993), five years before Rogue Squadron. Looks like Monkey Island 2 (1991) was the first to use the system. Dark Forces used it, too.
Meanwhile, Banjo-Kazooie implemented its form of dynamic music in a very simple way: extra channels that can be faded in and out. The "extra layers" method, where you turn layers on and off, is simple, so games like to use it.
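The "extra layers" approach is appealing precisely because the mixer is trivial. A minimal sketch (names and values invented for the example): every stem plays continuously in lockstep, and "turning a layer on" is just raising its gain, so everything stays synchronized for free.

```python
import numpy as np

def mix_layers(layers: dict[str, np.ndarray],
               gains: dict[str, float]) -> np.ndarray:
    """Sum always-running stems, each scaled by its current gain (0..1)."""
    out = np.zeros_like(next(iter(layers.values())))
    for name, audio in layers.items():
        out += audio * gains.get(name, 0.0)  # absent layers default to muted
    return out

n = 1000  # stand-in buffer length; real stems would be audio sample frames
layers = {
    "base": np.full(n, 0.2),        # constant buffers standing in for stems
    "underwater": np.full(n, 0.3),
}
# On land: only the base layer audible. Diving in: fade the underwater layer up.
dry = mix_layers(layers, {"base": 1.0, "underwater": 0.0})
wet = mix_layers(layers, {"base": 1.0, "underwater": 1.0})
```

In a real engine the gains would be ramped over many frames rather than switched, to avoid clicks.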
I remember there being legal threats from LucasArts every few years, presumably whenever new staff in legal became aware of ScummVM. The ScummVM team got used to it, but iMUSE was the only part that could actually get them into trouble, because it was still patented back in the early days. Nice to see that Nick Porcino is acknowledging the unofficial implementations in the references, including ScummVM and its now-defunct sister project ResidualVM.
The patent is from 1994 and would have expired in 2014.
USPTO says 2011 on their site.
The team working on ScummVM has provided a great historical resource to all of us. I have no exposure at all to the interactions with ScummVM you mention, so I won't speculate on that.
Hopefully the patent served more in creating a safe space for musical proceduralism in games rather than being chilling. As mentioned in this thread there has been brilliant proceduralism in so many games since Monkey 2.
I figured it was high time to highlight the innovations in iMuse, because I realized in discussions that the core principles weren't well known and are difficult to extract from public sources like the patent unless you are really fluent in that language.
ResidualVM was merged back into ScummVM as ScummVM picked up other 2.5D/3D engines beyond GrimE. "Defunct" makes it sound like it failed, but it succeeded well enough for ScummVM to own it more directly internally rather than as a sort of fork.