We’ve worked on music games for quite a while now. Dan and I were part of the team that made the DJ Hero games and I also headed up the team creating most of the Guitar Hero downloadable content for Neversoft.
We left FreeStyleGames/Activision in 2013 to start Mad Fellows. We put out a mobile game called SineWave that used our own engine. We then decided to make a multi-platform console game. With such a tiny team it would surely take a miracle… or Unity!
As Aaero is our first Unity game, we set out to spend some time getting used to the workflow. Things came together very quickly: within five days we had a basic prototype together, a definite proof of concept that we were onto something cool.
So how does it work? Y’know, like with the music and stuff?
I’m going to keep this fairly high level because it’s very complicated and… *sigh*… honestly, I’m the creative side so I basically design the tools I need, Dan makes them and then I use them. As far as the actual nuts and bolts of how it all works, you’d need to talk to Dan. I’ll get him to answer any questions you may have if that helps.
Thinking in beats and bars
The first thing that’s pivotal to how our game works is that it ‘thinks’ in terms of beats and bars. Anything that would usually be expressed in seconds/milliseconds instead has musical timing.
The main reason for this is allowing prefabs to be compatible with tracks of different tempo without modification.
For example, if an enemy animation has a duration of 2 seconds it will loop perfectly in time with music at 120 beats per minute. However, if you then use that same animation in another track, it will be down to pure luck if it still moves in time.
The timeline above is grabbed from a music sequencer. Notice that when we change the BPM, the bar time stays constant and the seconds/milliseconds update to compensate.
In Aaero we set everything to a bar-time duration. Then, if it’s set to be 2 bars long, that’ll scale depending on the BPM of the track and will always be perfectly in time.
Here’s a handy online tool for doing quick conversions between beats and milliseconds:
This conversion is essentially what our tools do behind the scenes.
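As a rough sketch of that conversion, here's the arithmetic in Python. The function names and the 4/4 assumption are ours for illustration, not Aaero's actual tool code:

```python
# Convert musical time (beats/bars) to wall-clock milliseconds.
# Assumes 4/4 time; names are illustrative, not from Aaero's tools.

def beat_duration_ms(bpm: float) -> float:
    """One beat lasts 60,000 ms divided by the tempo."""
    return 60_000.0 / bpm

def bars_to_ms(bars: float, bpm: float, beats_per_bar: int = 4) -> float:
    """Convert a duration in bars to milliseconds at a given tempo."""
    return bars * beats_per_bar * beat_duration_ms(bpm)

# A 2-bar animation at 120 BPM lasts 4 seconds; at 140 BPM the same
# bar-time duration automatically comes out shorter.
print(bars_to_ms(2, 120))  # 4000.0
print(bars_to_ms(2, 140))
```

Authoring everything in bar time and converting at the last moment is what lets the same prefab stay in sync across tracks of any tempo.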
Building a track
When we’re building a track for a new song, we start with the spline that the gameplay takes place along. We have created a tool we call the ‘route editor’ (I decided we needed this after I’d played Scream Ride one night). The route editor has now become a fairly robust and fully featured rollercoaster creator. As these splines form the foundations of the tracks, the tool was given much consideration.
Each segment of track represents a beat, and every four segments make up a bar. In the image above you can see that the currently selected segment covers bar 67, between beats 1 and 2 (67.1.000 to 67.2.000).
In order to be able to reuse our environment assets between different songs, assets are built to a particular size. For example, a tunnel section is usually 1 beat long. However, unlike the animations, instead of scaling the asset to meet the bar time, we control the speed the ship travels along the spline to make sure each element passes the camera at the correct beat.
Put simply, regardless of the tempo of the music, the ship will adopt a speed to ensure it takes exactly one beat to traverse each tunnel section.
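That speed rule boils down to one line of arithmetic. Here's a minimal sketch; the numbers and function name are illustrative, not Aaero's real code:

```python
# Speed needed so the ship crosses one track segment (one beat of
# geometry) in exactly one beat of music, whatever the tempo.

def ship_speed(segment_length: float, bpm: float) -> float:
    """Units per second to traverse one segment in one beat."""
    beat_seconds = 60.0 / bpm
    return segment_length / beat_seconds

# A 10-unit tunnel section at 120 BPM (0.5 s per beat) needs 20 units/s.
print(ship_speed(10.0, 120))  # 20.0
```

Scaling the ship's speed rather than the assets keeps the environment geometry reusable between songs.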
MIDI

MIDI, as far as I’m concerned, is the main ingredient that makes developing a music game with such a small team possible. For anyone that isn’t familiar with MIDI, it stands for… um… Music… er… *let me google this*…
MIDI obviously stands for ‘Musical Instrument Digital Interface’. It’s essentially just a set of instructions that is fed into digital instruments.
You know those self-playing pianos that have a reel of paper with holes in? MIDI is that but digital.
(Picture courtesy of mechanicalmusicrestoration.com …like they’re going to check!)
So, a MIDI file is literally just an event list of which notes to play at specific times. It’s usually visualised in a MIDI editor or sequencing package as a ‘piano roll’… wait… that’s because of that paper thing up there! Aaaaah, it all makes sense now.
If you’re a MIDI expert already and this is boring you, see if you can work out what song that is.
We feed these MIDI files into our tools, which use them to place almost every element of Aaero. Each environment model, each enemy and pretty much everything in between is positioned by a MIDI event.
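The idea can be sketched like this: treat the MIDI track as an event list and use each event's musical time to decide where along the route an object appears. The event format and the note-to-prefab mapping below are assumptions for illustration, not Aaero's real pipeline:

```python
# Place objects along the route from a MIDI-style event list.
# Events are (absolute_beat, note_number) pairs, as a MIDI parser
# might produce; the note-to-prefab table is a made-up example.

BEATS_PER_BAR = 4

NOTE_TO_PREFAB = {60: "enemy_turret", 62: "tunnel_section"}

def place(events):
    placements = []
    for beat, note in events:
        bar = int(beat // BEATS_PER_BAR) + 1       # bars are 1-indexed
        beat_in_bar = beat % BEATS_PER_BAR + 1     # beats are 1-indexed
        prefab = NOTE_TO_PREFAB.get(note, "unknown")
        placements.append((prefab, bar, beat_in_bar))
    return placements

events = [(0.0, 60), (4.0, 62), (8.5, 60)]
for prefab, bar, beat in place(events):
    print(f"{prefab} at bar {bar}, beat {beat}")
```

Because the events carry musical time rather than seconds, the same chart logic works for any tempo once it's combined with the bar-time conversion described earlier.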
The ribbons of light that you follow in Aaero (coming soon to Xbox One, PS4 and Steam, you should buy it and tell all your friends) are all positioned by MIDI.
An example of ribbons in Aaero (second half of video).
When using MIDI, gameplay is designed and created alongside the music in sequencer software such as Reaper.
The image above shows the view used for creating the ribbons. We use the velocity envelope to control the direction between clockwise and counter-clockwise.
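Since MIDI velocity is just a number from 0 to 127, turning it into a direction flag is trivial. This is a sketch of the idea; the threshold of 64 is our assumption, not a value from Aaero:

```python
# Use a MIDI note's velocity (0-127) as a direction flag for a ribbon.
# The midpoint threshold is an illustrative choice.

def ribbon_direction(velocity: int) -> str:
    return "clockwise" if velocity >= 64 else "counter-clockwise"

print(ribbon_direction(100))  # clockwise
print(ribbon_direction(30))   # counter-clockwise
```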
When playing Aaero, if you don’t follow the ribbons, the specific audio tied to the ribbons drops out. It’s usually the main lead synth but sometimes we use vocal tracks too.
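One way to implement that dropout, sketched under our own assumptions (the fade rate and function shape are illustrative, not Aaero's actual audio code), is to fade the tied stem's gain toward zero whenever the player is off the ribbon while the rest of the mix keeps playing:

```python
# Fade a stem's gain toward 1.0 (on the ribbon) or 0.0 (off it),
# called once per frame. The fade rate is an illustrative choice.

def update_stem_gain(gain: float, on_ribbon: bool, dt: float,
                     fade_per_second: float = 4.0) -> float:
    target = 1.0 if on_ribbon else 0.0
    step = fade_per_second * dt
    if gain < target:
        return min(gain + step, target)
    return max(gain - step, target)

g = 1.0
for _ in range(10):                    # player off the ribbon for 10 frames
    g = update_stem_gain(g, False, dt=1 / 60)
print(round(g, 3))
```

Fading rather than hard-muting avoids audible clicks when the player drifts on and off the ribbon.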
In order to do this, we needed to source multitrack recordings (or ‘stems’) from the artists and record labels.
Fortunately for us, the artists and labels we work with are super cool and helped out loads. This isn’t always the case. You’ll find that most multitrack recordings you’re sent will need mastering before they sound like the released version. Also, understandably, record labels and artists consider these assets sensitive: they’re not meant for listening to individually and can be used to create all sorts of horrible mashups and bootlegs that get posted to YouTube and ruin everyone’s day.
So this post has been fairly random and rambling. I’ve flitted between different bits and pieces without going into much detail. Hopefully it still gives a bit of an insight into how we do things. We’re always happy to hear from other developers, so if you have any questions or want more detail on specific areas, do give us a shout!