Level Up: The Evolution of Video Game Audio

In 1958, nuclear physicist William A. Higinbotham created a crude electronic tennis game for visitors to Brookhaven National Laboratory. “Tennis for Two” consisted of an analog computer, two controllers and an oscilloscope screen. It is widely considered to be the first video game.

Thirteen years later, Nolan Bushnell and Ted Dabney created the first video game with sound, “Computer Space”. The year after that, the height of game audio was the naked blip of a paddled ball in the arcade smash “PONG”.

Today, even Sir Paul McCartney can hum the Super Mario Brothers theme, which he once did for its creator, Koji Kondo, backstage at a concert. Sold-out performances by game composer Nobuo Uematsu with the Los Angeles Philharmonic have led to near-riots. Beck included an 8-bit intro on his charting pop tune “Girl,” while Trent Reznor stepped away from Nine Inch Nails to score games like “Quake” and “Call of Duty: Black Ops II.” There is even “chiptune,” a rich sub-genre of electronic music created entirely with the circuitry of our favorite handheld and home consoles of the ’80s. Once an afterthought, game audio now lives a second life outside of pixelated screens.

Then and Now

“You’ve got to have sound,” Nolan Bushnell told his young employee Al Alcorn while Alcorn worked on an exercise in game-building that would become Pong (Bushnell would later become chairman of Atari). “I want the roar of a crowd of thousands,” Alcorn recalls Bushnell saying in an interview with IGN.

“How do you do that with digital circuits? Ones and zeroes? I had no idea, so I went in there that afternoon and in less than an hour poked around and found different tones that already existed in the sync generator, and gated them out and it took half a chip to do that. And I said ‘there’s the sound – if you don’t like it you do it!’”

The “Journey” arcade game was based on the band’s music and exploits. When players achieved a certain level, a cassette player inside the machine played an analog tape of the band’s song “Separate Ways.” (Image from Flickr user Greg Dunlap)

From these humble beginnings, game audio blipped and farted along for several more years, using complex processes to make primitive sounds. The programmable Atari Video Computer System (VCS) of 1977 carried an 8-bit MOS Technology 6507 CPU and a single custom chip, the TIA (developed under the codename “Stella”), for graphics and sound. The TIA allowed programmers to control sound output across three variables: type of sound, frequency, and volume. Unfortunately, these controls weren’t sophisticated enough to easily mimic the tunings of traditional Western instruments. Early attempts at melody played like a tone-deaf robot trying to hum its favorite folk song.
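To see why, here is a rough, illustrative Python sketch. It assumes the commonly cited simplification that a TIA tone is a fixed base clock divided by the value of a 5-bit frequency register, so only 32 pitches are available per waveform. The 15 kHz constant and the target notes are stand-ins chosen for readability, not an emulator-accurate model.

```python
import math

# Rough stand-in for the TIA's tone clock; the real chip derives it from the
# video clock and divides it further depending on the waveform selected.
BASE_CLOCK_HZ = 15_000

def available_pitches():
    """The ~32 frequencies reachable through the 5-bit frequency register
    (AUDF0/AUDF1 on the real chip)."""
    return [BASE_CLOCK_HZ / (register + 1) for register in range(32)]

def cents_error(target_hz, actual_hz):
    """Distance between two pitches in cents (100 cents = one semitone)."""
    return 1200 * math.log2(actual_hz / target_hz)

if __name__ == "__main__":
    pitches = available_pitches()
    for name, target in [("A5", 880.00), ("C6", 1046.50), ("E6", 1318.51)]:
        best = min(pitches, key=lambda f: abs(cents_error(target, f)))
        print(f"{name} ({target:.0f} Hz): nearest pitch {best:.1f} Hz, "
              f"{cents_error(target, best):+.0f} cents off")
```

Run it and some notes land within a few cents of true pitch while their neighbors come out 40 to 60 cents sharp, which is exactly the tone-deaf-robot effect described above.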

In the mid-80s, Nintendo dominated the home console market, ushering in a golden age of game themes. The NES offered composers five channels to work with: two square wave channels, one triangle wave channel, a noise channel (for percussive sounds), and a digital channel that could play crude samples (think of any voice you’ve heard in a Nintendo game). All music still had to be converted to plain text and numbers for programming purposes, so implementation was a complex and often unmusical process, as game composer Neil Baldwin (Magician, James Bond Jr.) writes on his website.

“There were no tools to speak of so everything was entered as numbers in the [6502] assembler/editor. I worked out tunes on a little Yamaha keyboard and typed in the pitches and durations. Often I’d work out timings on some squared graph paper, mostly by trial and error.”
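As a rough illustration of what typing in pitches and durations might look like, here is a toy Python sketch. The data format, the note numbers, and the driver loop are hypothetical inventions for readability; a real NES driver like the one Baldwin describes would be 6502 assembly writing values to the console’s sound registers.

```python
# Hypothetical (pitch, duration) note table of the kind a composer would
# work out on graph paper and type in by hand -- not Baldwin's actual format.
NOTE_OFF = 0

# Pitches use MIDI numbering (60 = middle C) purely for readability; a real
# driver would store hardware period values for the square/triangle channels.
square_1 = [
    (60, 2), (64, 2), (67, 2), (72, 4),   # rising C-major arpeggio
    (NOTE_OFF, 2),
    (67, 2), (64, 2), (60, 4),            # and back down
]

def render(track, frames_per_unit=6):
    """Expand the note list into one pitch value per 60 Hz video frame,
    the way a per-frame sound driver steps through its data."""
    timeline = []
    for pitch, duration in track:
        timeline.extend([pitch] * (duration * frames_per_unit))
    return timeline

if __name__ == "__main__":
    frames = render(square_1)
    print(f"{len(frames)} frames, about {len(frames) / 60:.2f} seconds of music")
```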

Famed game composer George “Fat Man” Sanger (Wing Commander) saw early game audio’s limitations as a challenge, but never a hindrance.

“The feeling that I might have been ‘limited’ in some way translated in my mind immediately into a positive: ‘OK, I’m writing for a new medium that has these requirements and these superpowers. What will I do with it?’ I just couldn’t imagine Bach being bugged by having to write for ‘just a string quartet.’ My players (the oscillators that were available to me) might have a tone less thrilling than Bach’s target platform, but my guys would never miss a note, and could play as fast as I wanted them to, make huge leaps–all kinds of things.”

Composer, educator, and co-author of the book “The Essential Guide To Game Audio,” Steve Horowitz is old enough to remember “The Fat Man” selling floppy disks full of General MIDI sounds to aspiring composers for a buck apiece. By the time Horowitz began composing for games in the early 90s, however, the CD-ROM had introduced a sea change within the industry.

“[In] a cartridge game the whole thing had to be 6 megabytes max, including the music. Suddenly you jump into CD-ROM and you have 500 megs for a game… Those were really exciting times back in the early 90s because—as a composer—all bets were off. One day you’re working on a heavy metal track, the next day you were doing an orchestral track, then you’re in the studio for Sega. It was really the wild wild west of games.”

With the floodgates of game audio now wide open, composers like Nobuo Uematsu blurred the lines between “traditional” music and game music with rich, orchestral scores. Uematsu composed the scores for the first nine Final Fantasy games and has seen his music performed on sold-out tours by orchestras in Japan and North America.

With the palette of game music now equal in scope to that of popular music, composers must look at their approach as a key way to differentiate themselves from the competition.

Scoring Outside the Lines: What Makes for Good Game Music?

“The fundamental difference between composing for non-linear interactive media, and linear media like film and television, is that in a film or with TV, you get the video and you can always go right to that place where the guy gets mad at his girlfriend and he storms out and gets hit by a bus, and it’s always at one minute and twenty-seven seconds and you can just construct the perfect cue, so that when he’s squashed everybody’s really sad,” says Steve Horowitz. “But it doesn’t work that way in games. In games, you don’t know how long that person is going to stand there and have that argument, you don’t know if they’re going to go out through the left door or the right door. You have to be able to compose modularly. It’s a different way of thinking about the construction of music.”

Amanda Rose Smith

Composer Amanda Rose Smith, who often works with Muzzy Lane Games, echoes Horowitz’s sentiment that game composers need to be comfortable with ambiguity. The interactivity of the medium dictates that they can’t write complete pieces with concrete beginnings, middles, and endings.

“In games, sometimes it works better to think of a piece as an emotion. It has to evolve and change, but also do so in a way that when it comes back to the beginning, it doesn’t feel like a repeat so much as another variation on a theme that is present throughout the environment.”

Smith highlights the delicate balance the game composer must strike: repeat a catchy chorus in a pop song 4 times and you’re a genius, but repeat a piece of music in a game 200 times and even the most interesting melodies can turn grating.
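Here is a minimal sketch of how those two ideas, modular construction and repeat-avoiding variation, might look from the game-code side. The segment names, state labels, and selection rule are hypothetical and purely illustrative; they are not drawn from any of these composers’ actual tools.

```python
import random

# Hypothetical pools of interchangeable loops, one pool per game state.
SEGMENTS = {
    "explore": ["explore_a", "explore_b", "explore_c"],
    "tension": ["tension_a", "tension_b"],
    "combat":  ["combat_a", "combat_b", "combat_c"],
}

class ModularScore:
    """Picks the next musical segment from the current state's pool,
    avoiding an immediate repeat so loops read as variation."""

    def __init__(self, state="explore"):
        self.state = state
        self.last_segment = None

    def set_state(self, state):
        # Called by game code when the player's situation changes.
        self.state = state

    def next_segment(self):
        pool = SEGMENTS[self.state]
        choices = [s for s in pool if s != self.last_segment] or pool
        self.last_segment = random.choice(choices)
        return self.last_segment

if __name__ == "__main__":
    score = ModularScore()
    for state in ["explore", "explore", "combat", "combat", "explore"]:
        score.set_state(state)
        print(f"{state:8s} -> {score.next_segment()}")
```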

The legendary Koji Kondo, Nintendo’s first hired composer, and a sound director there to this day, relied on his immersion in the gameplay to create indelible scores like those for Super Mario Brothers and The Legend of Zelda.

Kondo used BASIC (Beginner’s All-purpose Symbolic Instruction Code) to write his scores. (He later wrote a manual geared toward families interested in programming Japanese popular music into Nintendo’s Famicom console using BASIC.) His position within Nintendo allowed him to “start working on the sound of the game as soon as the rest of development was ramping up, so we’d be working in parallel to them.”

The Super Mario Theme we now think of as classic was rewritten several times based on Kondo’s reactions to hearing the music in the context of real-time gameplay.

“It had to fit the game the best, enhance the gameplay and make it more enjoyable. Not just sit there and be something that plays while you play the game, but is actually a part of the game. As I’d create a piece of music, I’d set it aside and start working on another one, and then notice that something didn’t fit, so I’d go back and fix it. And so all of my rewriting and recomposing was self-motivated.”

Know Your Machine

Musician Mike Pace fell into composing for games after his trajectory in the “old” music world hit a downward slope. His band, Oxford Collapse, had dissolved after 8 years and the fulfillment of a two-album contract with the Sub Pop label. Pace wanted to stay active as a songwriter and musician, but had no desire to continue dealing with “band” issues like lugging equipment around and managing musician personalities.

In the summer of 2009, Pace began teaching himself the basics of MIDI. Soon thereafter, a New York friend started an iPhone app company called Eyedip, and Pace finagled his way into the position of in-house composer.

“One game [Pocket Devil] was a hit, so we had maybe 5 or 6 more in development. I was providing all the theme songs and music for various levels. If the games needed any sound effects, I was either creating them myself or getting them from stock media sites. It was fairly consistent work for the first year and a half.”

Pace received a 3% royalty on sales of Pocket Devil. Not bad, considering that for a brief period in late 2009, the game was one of the most-downloaded iPhone apps in the world. Pace clearly saw the potential of a career in composing for games, but a visit to the Game Developers Conference in San Francisco—where he met some of his competition face to face—suggested that it might be difficult to realize that potential.

“It was really satisfying creatively, but in the end I couldn’t sustain it because I just couldn’t get enough work and I realized how competitive it is. I also realized that there were people who were much much better at it than I am.”

Amanda Rose Smith believes that the key to surviving the competitive marketplace of game composers is a thorough understanding of gameplay and platforms, rather than pure musicianship.

“Learn the technology,” she says. “Having composing chops is important, but if you don’t learn how to effectively communicate them, it won’t matter. Especially in lower budget games, the more skill you have, and the greater understanding you have of how your job fits into the entire development process, the better. You bring more to the table that way.”

“You need to understand how game engines work,” Steve Horowitz adds. “You need to understand how hardware works—what the capabilities are. Walk a mile in that programmer’s shoes so you know what’s possible. The more information you have about doing your own implementation, the more employable you are. It’s not an absolute, but the market is getting tighter.”

Legendary video game composer Koji Kondo. It has been said of him that “Koji Kondo is not the Mozart of videogame music. Mozart was the Koji Kondo of classical music.”

Horowitz and Smith are talking not just about gaining platform-specific knowledge, but about working with and understanding the third-party “middleware” that has become widely available, and widely used, in the last few years. Programs like FMOD and Wwise offer a user-friendly interface that lets composers create events and parameters for their music, audition how it might behave within a game environment, and export the finished product in a form that can be handed directly to programmers for implementation in the game.
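To make that division of labor concrete, here is a stripped-down Python sketch of the event-and-parameter idea. It is not FMOD’s or Wwise’s real API; the class, the parameter name, and the layer thresholds are invented for illustration. The layering rules stand in for what the composer authors in the middleware tool, while set_parameter stands in for the call a programmer makes at runtime.

```python
# NOT either library's real API -- an illustration of the event/parameter
# model: the composer authors the behavior, the game code only drives it.
class MusicEvent:
    def __init__(self, name, layers):
        self.name = name
        self.layers = layers          # e.g. ["pad", "strings", "drums", "brass"]
        self.parameters = {}

    def set_parameter(self, name, value):
        """Runtime hook the game code drives (enemy count, health, etc.)."""
        self.parameters[name] = max(0.0, min(1.0, value))

    def active_layers(self):
        """Fade layers in as 'intensity' rises -- behavior the composer would
        author in the middleware tool, not in game code."""
        intensity = self.parameters.get("intensity", 0.0)
        threshold = {"pad": 0.0, "strings": 0.3, "drums": 0.6, "brass": 0.85}
        return [layer for layer in self.layers
                if intensity >= threshold.get(layer, 0.0)]

if __name__ == "__main__":
    combat = MusicEvent("music/combat", ["pad", "strings", "drums", "brass"])
    for enemies in (0, 2, 5, 9):
        combat.set_parameter("intensity", enemies / 10)
        print(f"{enemies} enemies -> layers: {combat.active_layers()}")
```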

“When computers powered out to a certain point, you had people saying ‘We don’t need MIDI. Just take a symphony orchestra, cut it up into a million pieces and put it back together in the game,’” says Steve Horowitz. “Now plugin companies like iZotope and others are starting to build cool stuff to go inside of audio middleware and even consoles themselves, using MIDI once again to trigger sounds inside games. It’s all part of this cycle of what’s old is new again.”

Mike Pace couldn’t see deciphering middleware as a part of his future. He now composes for film and television, in addition to writing his own pop songs again. It’s a process he says has been informed by his experience with game audio.

“In the past, if I stumbled upon something I liked, I didn’t want to repeat it too much. Get in, do your thing, get out. Now, I’ll come up with something—a riff or piano line—and loop it, and that becomes a verse. I like working within Reason, cutting and pasting, creating a song out of a 4 or 8-bar sequence. You add things, you subtract things, and it becomes this sort of Kraut-Rock thing. Some of the video game stuff I’ve done, I’ve gone back and used it as a basis for a new pop song. I guess the way I look at it is that it was all helping me as a songwriter.”

As technology continues to improve and platforms continue to expand, Steve Horowitz believes there will be more and more opportunities for composers of all types in game audio (provided they know their stuff).

“The pipeline to how to score a game is becoming more user-friendly, and just a few years from now, we’re going to see these systems be really user-friendly. Every time we think it’s the best it can be, the platforms change. Some people are doing 100-piece orchestra titles for Xbox One games, while across town someone is sitting with a MIDI program trying to figure out audio for mobile.”

Today’s game composers can choose any number of doors to walk through. What music will be playing when they do?

Blake Madden is a musician and author who lives in Seattle.
