A music-adaptive cinematic
Headphones highly recommended!

Creation process


We are a small group of coworkers from Costa Rica. We work every day creating augmented- and virtual-reality apps for mobile devices and VR headsets, and we decided to enter the Neon Challenge to explore the short-film world with all the Unity features created for this purpose.
  • Jorge Ramírez - Programmer
  • Pablo Zúñiga - Digital Animator
  • Mairon Corrales - Programmer

Narrative inspiration

Our main goal was to create a short film based mostly on music: all the elements and the environment would react to any song we fed into the system.
We started with many ideas and ended up with a very robust story: multiple characters in a post-apocalyptic world that would come to life with the music. That was a very ambitious idea, and we couldn't finish it because we worked on this project only in our free time after work. So we decided to create this loop-based short film instead, using the audio-processing core we had already built.
Our final idea was a very enigmatic humanoid robot with a strong presence, walking down a looping street, interacting with some elements and bringing the city to life with every step, all driven by the music.
Our main reference for this project (at least for the final idea) was:
Walking city loop

Scene conceptualization

Based on our final idea, we chose a city, since it was perfect for the loop and matched the challenge's main requirement.
These are some of our inspirations for the city:

Material creation

I'm going to split this part into two sections: digital animation and programming.
Digital Animation:
The disco lights were our main inspiration when defining the robot's design. We used Autodesk Maya to make the basic mesh and then ZBrush for the details.
Finally, we used Substance Painter to create the final materials for Unity:
Programming:
This is the backbone of our idea. We created an audio core where all the music is processed based on its spectrum, using Unity's AudioListener spectrum functions.
All the objects that react to the music are separate game objects that change their behavior based on the core system's data.
Optimization was very important: audio processing is a heavy load for the hardware, we had many objects reacting at the same time, and the rendering was already heavy by itself. That is why we created a single core that every object consults for the numbers, colors, averages, light intensities, etc. it needs for its own purpose.
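In Unity the raw data comes from AudioListener.GetSpectrumData; the core pattern of collapsing the spectrum into a few smoothed band averages once per frame, and letting every reactive object read from that single structure, can be sketched language-agnostically like this (the names, band count, and smoothing factor are our illustration, not the project's actual code):

```python
class AudioCore:
    """Single shared core: processes the spectrum once per frame;
    every music-reactive object reads from it instead of doing its own work."""

    def __init__(self, num_bands=8, smoothing=0.8):
        self.num_bands = num_bands
        self.smoothing = smoothing          # 0 = no smoothing, closer to 1 = slower response
        self.bands = [0.0] * num_bands      # smoothed per-band averages

    def update(self, spectrum):
        """Call once per frame with the raw spectrum samples."""
        size = len(spectrum) // self.num_bands
        raw = [sum(spectrum[i * size:(i + 1) * size]) / size
               for i in range(self.num_bands)]
        # exponential smoothing so lights/objects don't flicker on every frame
        self.bands = [self.smoothing * old + (1.0 - self.smoothing) * new
                      for old, new in zip(self.bands, raw)]

    def intensity(self, band):
        """What a light or object queries instead of touching the spectrum itself."""
        return self.bands[band]
```

Each game object then maps `intensity(band)` onto its own property (light brightness, emissive color, scale), so the expensive spectrum work happens exactly once per frame.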

Production process

As we started with a different idea, we began with some lighting tests using Atmospheric Scattering from The Blacksmith asset. We wanted a desolate landscape with cold illumination; here are a couple of screenshots of what we got:

We then did some audio-processing tests and ended up creating the audio core explained above:

We made some rigging tests with the Unity Humanoid Rig and mocap animations on our new robot:

Then we jumped to the second idea and started testing the city and the loop.
Next came the lighting. We mixed global illumination with baked spot lights, reflection probes, and realtime area lights. We also added volumetric fog, which contributes a lot to the lighting.
Then we added the music system and made some camera tests:

Post production

Finally! We added post-production with the Post-Processing Stack, specifically: fog, anti-aliasing, ambient occlusion, bloom, color grading (maybe the most important here), chromatic aberration, grain, and vignette.


I would like to start by talking about the Unity Recorder. Normally you would render your short film at a fixed frame rate, ending up with a very fluid video. The problem is that the Unity Recorder doesn't support audio output. So we had a problem: our entire project is based on audio! If we record at a fixed 60 FPS, the spectrum is calculated for whatever frame the recorder is currently on, but the song keeps playing at its original speed, so the render would be out of sync.
For that reason, we had to render the video at a variable frame rate. Sadly we didn't get a fluid 60 FPS video, but at least a well-synchronized one.
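The desync is easy to quantify: during capture the video clock advances by 1/60 s per frame, while the song keeps playing in real time, so audio time runs ahead by however long each frame actually takes to render and save. A rough sketch (the 50 ms per-frame capture time is a made-up figure for illustration):

```python
fps = 60                 # target frame rate of the recording
capture_ms = 50          # assumed real time to render + save one frame (illustrative)
frames = 600             # 10 seconds of footage at 60 FPS

video_time = frames / fps                # where the video thinks the song is
audio_time = frames * capture_ms / 1000  # where the song actually is
drift = audio_time - video_time
print(f"video: {video_time:.1f}s  audio: {audio_time:.1f}s  drift: {drift:.1f}s")
# → video: 10.0s  audio: 30.0s  drift: 20.0s
```

Capturing at a variable frame rate (matching real time) keeps the two clocks equal at the cost of smoothness, which is the trade-off we took.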

Unity Asset Store

The Blacksmith: Atmospheric Scattering
Even though we didn't use it in the end, we worked with it for almost half of our time to test our first scenes, described in the production process.
Raw Mocap Data for Mecanim
To add animations to our robot and to verify the rigging process.
To create all the camera movements and shots.
Post Processing Stack
For all the post-production process described before.
Unity Recorder
To get our final render.
Volumetric Lighting
From the ADAM short film, and not officially on the Asset Store; we used it to add a very nice fog effect that is affected by the area lights.
We started using Timeline for our first idea, but we didn't need it for the final one, because everything works automatically with the music and the loop does the rest of the work.

We hope you like our project and feel free to ask whatever you want!

Jorge Ramírez
Software Developer - Programmer
Pablo Zúñiga Chaves
Digital Animator - Artist
Mairon Corrales
Software Engineer - Programmer