Wizarding World of VR
Fantastic Beasts Brings Together Realtime and Pre-rendered


Who We Are

Framestore is probably best known for crafting the VFX for major Hollywood films such as the Harry Potter series and Gravity, so it's worth giving a brief overview of our VR work for context.
We have been VR converts for some time: we created our first commercial VR experience back in early 2014 with HBO, and have had a team dedicated to VR ever since. We've worked on installations such as a virtual hike for Merrell (roomscale VR with a DK2), mobile VR with brands like Marvel, and a unique, multi-award-winning bus trip to Mars.

The Wizardry of Daydreams

Last year, Framestore was finishing off the VFX for Fantastic Beasts and Where to Find Them, which proved a thrilling opportunity to revisit the Harry Potter world. Our team wanted to use VR to immerse people in the world of the movie: there were so many beautiful sets and creatures involved that it felt like a nice little companion to the film. With Google launching Daydream the month before the film's première, it seemed the perfect match.

Pre-production Problems, and How to Solve Them

From the start, this project was going to be tight: we had eight weeks to complete an in-store demo, and a further four weeks to expand to the full release.

How do we render the beasts on a mobile phone?

Mobile devices can be pretty good at rendering realtime 3D graphics nowadays, but film VFX shots have also become rather complicated:
The beasts in our VR experience had to look like they came from the same world, yet in the film each frame took multiple hours to render. We felt that getting this same look in mere milliseconds was too much of an ask, even for the best devices. Instead we opted to pre-render our creatures using the same assets as the film, and to play the resulting video inside Unity.
We animated in much the same way as we would for realtime animation; the idea was to flip between these pre-rendered clips seamlessly, creating the impression of continuous movement.
This is not a typical use case for video players, so we built an optimised NDK video player for Android as a Unity plugin. The player itself also contained an animation state machine, so that at a C# level we could deal with actions and events, whilst at a C++ level the plugin would handle scheduling animations and idles.
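To give a flavour of that split, here's a minimal sketch of what the C# side of such a binding could look like. The fbvr_* native entry points and the class name are invented for illustration; they are not the plugin's actual API.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical C# binding for a native (NDK) video player plugin.
// The fbvr_* entry points are illustrative, not Framestore's real API.
public class BeastVideoPlayer : MonoBehaviour
{
    [DllImport("fbvrplayer")] private static extern int  fbvr_CreatePlayer(string clipPath);
    [DllImport("fbvrplayer")] private static extern void fbvr_QueueClip(int player, string clipName);
    [DllImport("fbvrplayer")] private static extern bool fbvr_IsIdle(int player);

    private int playerHandle;

    void Start()
    {
        // The native side owns decoding and the animation state machine;
        // C# only raises high-level actions and reacts to events.
        playerHandle = fbvr_CreatePlayer("beasts/niffler_idle.mp4");
    }

    // Called from gameplay code, e.g. when the user waves the wand.
    public void TriggerAction(string actionClip)
    {
        // The plugin schedules the transition on a clip boundary, so the
        // flip between pre-rendered clips reads as continuous movement.
        fbvr_QueueClip(playerHandle, actionClip);
    }

    void Update()
    {
        if (fbvr_IsIdle(playerHandle))
        {
            // Back in the idle loop; safe to queue the next action.
        }
    }
}
```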
These videos were then overlaid on a stereo 360° cubemap. We wrote scripts to place these cards at the correct position with respect to the background plate, with layering handled via render order.
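As a rough sketch of that placement step (the component and field names here are ours, purely for illustration): each card sits along an authored direction at a depth matched by eye against the plate, billboarded towards the camera rig.

```csharp
using UnityEngine;

// Illustrative placement of a video card against the pre-rendered plate.
// The direction and depth would be authored per shot to line up with
// the background cubemap.
public class VideoCardPlacer : MonoBehaviour
{
    public Transform cameraRig;      // centre of the stereo 360° render
    public Vector3 plateDirection;   // direction to the creature in the plate
    public float plateDepth = 4.0f;  // depth matched by eye, in metres

    void LateUpdate()
    {
        // Place the card along the authored direction at the matched depth,
        // then face it towards the rig so the flat video reads as part of the scene.
        transform.position = cameraRig.position + plateDirection.normalized * plateDepth;
        transform.rotation = Quaternion.LookRotation(transform.position - cameraRig.position);
    }
}
```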
It's a good trick to learn that you can set a shader's queue tag to "Queue" = "Transparent+1", then "+2", "+3" and so on to establish your layers in a consistent way, much as you would with Photoshop layers. It's handy for those 'gotcha' moments when you realise you need an extra mask layer in your shot, and very quick to duplicate and test.
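The same layering can also be expressed from C# by assigning explicit render queues to materials. This is a sketch under the assumption of one material per layer, not our production code; the offsets are illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Assigning explicit render queues to layer cards over the background
// plate, Photoshop-style. The specific offsets are illustrative.
public static class PlateLayers
{
    public static void Apply(Material background, Material creatureCard, Material maskCard)
    {
        int transparent = (int)RenderQueue.Transparent; // 3000

        background.renderQueue   = transparent;      // the cubemap plate
        creatureCard.renderQueue = transparent + 1;  // beast video card
        maskCard.renderQueue     = transparent + 2;  // extra mask layer on top
    }
}
```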

A Wizard Needs a Wand

We had our beasts in the world at this point. Now all that was left was to place the user in the shoes of a wizard. Google Daydream comes complete with a controller, and it felt a little disappointing to use it just for selecting videos. A wizard needs a wand, and items to interact with.
For this experience we took inspiration from the early Resident Evil games, overlaying realtime-rendered objects on a pre-rendered backdrop. We didn't know if it would work in VR, and we didn't know if it would work on such an extreme (360°) wide-angle pre-rendered shot. It turns out it worked just fine. (As a side note, the opening transition with Newt's case and the black background is a nod to the original games' door animation.)
To do this we built a barebones, collision-like geo of the offline-rendered scene, and used it to place realtime objects within the world. This was essential: without it, it would have been pretty much impossible to eyeball the depth accurately. There are some issues with this approach, most notably that with 360° renders the stereo is correct only at the center of the image, whereas realtime-rendered images are correct across the whole render.
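As an illustration of how such proxy geo might be used (the names here are hypothetical), a ray cast against the collision mesh gives a believable position and orientation for a realtime prop:

```csharp
using UnityEngine;

// Using a low-poly proxy of the offline-rendered set to position
// realtime props at a believable depth. Layer setup is illustrative.
public class ProxyPlacement : MonoBehaviour
{
    public Camera rigCamera;
    public LayerMask proxyGeoLayer; // the barebones collision geo

    // Place a prop where a ray through a screen point hits the proxy set.
    public bool PlaceProp(Transform prop, Vector2 screenPoint)
    {
        Ray ray = rigCamera.ScreenPointToRay(screenPoint);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f, proxyGeoLayer))
        {
            prop.position = hit.point;
            prop.rotation = Quaternion.FromToRotation(Vector3.up, hit.normal);
            return true;
        }
        return false;
    }
}
```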
In addition we used the shader queue trick, as above, to add contact shadows and occluding geometry, to better sell the placement of these items within the scene.

Of course it’s happening inside your head, but why on earth should that mean it’s not real?

Our aim for this project was to produce something which would bring people into Newt’s world, and to hopefully live up to the excellent work completed by Framestore’s Film VFX team. We’ve had favourable reviews, and were delighted to see both the film and VR experience receive two Visual Effects Society nominations apiece; I think we successfully managed to bring Beasts to life in VR.
Meagan Phillips
Marketing Director