In 2016 I was approached by SwordsGames to design, create and implement all audio for one of their upcoming projects. Unfortunately the project is still underway and I can't reveal too much specific information on the game itself. However, I can explain how that experience changed the way I think and feel about music.
In my musical career up to this point I had played in many different bands and played countless gigs. I was exposed to a myriad of styles, cultures and cliques, all of which helped me grow as a person and as a musician. I had recorded in studios and released EPs, 'depped' on albums and got my name out as a session player.
When I agreed to try my hand at production for video games, I was excited at the opportunity to expand my knowledge once again and to see 'behind the veil' of the music industry. The first thing that surprised me about video game music was how much my perspective changed when implementing audio in Unity.
For those that don't know, in standard audio editing software (a DAW, or Digital Audio Workstation) the layout is very linear. The track goes from left to right, following the time markers along the top. Makes sense, right? You start recording, the view scrolls as you play, and your track appears from the left and finishes on the right. Simple.
Then I opened the Unity Engine and began to implement my audio. All of a sudden I found myself in a 3D environment. Nothing is linear anymore, and I had a lot more to consider than I ever had in a DAW. Now I had to learn all about assets, roll-offs and reverb zones. Issues arose that I had never thought I would encounter. I thought I knew everything I could know about reverb after recording in studios for years, but what do I do when the reverb from one room is bleeding into another? How do I stop the sound of the enemy coming through the walls when you're not supposed to be able to hear them yet?
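To give a feel for what 'roll-off' means here: in a 3D engine, each sound source's volume is attenuated by its distance from the listener. Below is a minimal Python sketch of inverse-distance attenuation, similar in spirit to Unity's logarithmic rolloff curve. It is an illustration of the concept, not Unity's actual implementation; the `min_distance` parameter is my own naming.

```python
def logarithmic_rolloff(distance, min_distance=1.0):
    """Approximate gain (0.0-1.0) for a 3D sound source.

    Inside min_distance the source plays at full volume; beyond it
    the gain falls off as min_distance / distance, so doubling the
    distance halves the volume. This mirrors the general shape of an
    inverse-distance rolloff curve, not any engine's exact code.
    """
    if distance <= min_distance:
        return 1.0
    return min_distance / distance

# A source 2 units away at min_distance=1.0 plays at half volume:
# logarithmic_rolloff(2.0) -> 0.5
```

In a DAW you would automate volume by hand along a timeline; here the engine computes it every frame from where the player actually is, which is exactly the shift in thinking described above.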
Every day I was faced with new challenges that just don't arise in a live or studio setup. The only thing for me to do was to start from the bottom and learn, once again, how to mix, edit and arrange in this new context. I had to change my perspective on mixing and think of my audio assets as interactive, living elements.
Then I could begin to have some fun and really explore the potential of game audio.
Audio has a huge impact on immersion in a game. It can help set the atmosphere, cue events and actions, or even give information about things that can't yet be seen or interacted with (e.g. you can't see the dragon, but you can hear him about to breathe his fire!). Audio takes you further into a game than you expect to go and can convey abstract concepts that are just not possible in the real world.
This is why I have fallen in love with the Unity Engine, and with game audio design in general. It has kept me on my toes for two years and given me exciting challenges which force me to go back and do my research. It proved to me that the more you know, the more you realise how little you know!
Now I am starting my first project in commercial audio asset production, and I am very excited to be releasing my first asset pack very soon! I welcome any feedback and comments, and I hope that the community here can help me shape my products into something that will be useful and helpful to others on their journey into the crazy yet fulfilling world of game design!
Listen to the demo here: https://soundcloud.com/user-582153983/epic-electro-action-pack
Thank you for reading!