How Nissan Mexico is using virtual, augmented and mixed reality experiences to engage people at Fanzone
Fanzone is a space organized by Nissan Mexico where technology, innovation, and emotion converge. As a UEFA Champions League sponsor, Nissan wants to create new experiences that engage soccer fans.

Challenge

The challenge was to create experiences where soccer and technology converge, engaging people who like one or both. Specifically, we had to build three different experiences using augmented, virtual, and mixed reality:
  • Virtual Reality Experience. A virtual environment where you become a professional football player and practice three different tricks: knock over a training cone with the ball, kick the ball through an X-Trail's sunroof, and score a headed goal from an aerial pass.
  • Augmented Reality Experience. A mini soccer game with twisted rules: the soccer players are Nissan's superstar car, the GT-R. Four people play at the same time, which means four cars, four goals, and one ball. The virtual elements have to fit onto a real mini soccer field.
  • Mixed Reality Experience. A cooperative, soccer-based experience that uses mixed reality to create a real-life arcade feel. Both players take part in the game: dribbling a soccer ball, dodging real and virtual obstacles, collecting virtual items, and scoring a real goal. It also has to show spectators what is happening inside the game.

Results

We used Unity to develop all the experiences, since it makes it easy to integrate different SDKs, tools, and custom hardware. All models were made for PBR and textured with Substance Painter; our 3D art team obtained great results in a short time.

Virtual Reality Experience

We decided to use the Oculus Rift and the Neuron motion capture system. The Oculus setup is more flexible than other HMDs, and Neuron has the best cost-benefit ratio on the market: setup is easy, and its Wi-Fi-based sensor synchronization made it the best choice.
When using the Oculus HMD and the Neuron mocap system together, we ran into a small problem: each one wants to control head tracking. We had to do a little hack to override the mocap head tracking with the Oculus HMD: we adjusted the constraints on the skeleton used by the mocap system and applied the HMD's head tracking to the head bone's orientation.
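The override can be sketched like this (a minimal, illustrative Python model, not the actual Unity code; bone names and quaternion values are hypothetical, and in Unity this logic would run once per frame after the mocap pose is applied):

```python
# Sketch: mocap drives every bone, but the head bone's orientation is
# overwritten with the HMD's rotation each frame, so the Oculus wins
# the fight over head tracking. Quaternions are (w, x, y, z) tuples.

def apply_frame(skeleton, mocap_rotations, hmd_rotation):
    """Copy this frame's mocap rotations onto the skeleton, then
    override the head bone with the HMD's orientation."""
    for bone, rotation in mocap_rotations.items():
        skeleton[bone] = rotation
    # The hack: the HMD, not the mocap suit, owns the head bone.
    skeleton["head"] = hmd_rotation
    return skeleton

skeleton = {}
mocap = {"spine": (1, 0, 0, 0), "head": (0.9, 0.1, 0, 0), "l_arm": (1, 0, 0, 0)}
hmd = (0.7, 0, 0.7, 0)  # hypothetical Oculus head rotation

apply_frame(skeleton, mocap, hmd)
```

All bones keep their mocap pose except the head, which takes the HMD rotation; this mirrors the constraint adjustment described above.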

Augmented Reality Experience

AR. We used the Vuforia AR SDK because we had experience with it and it covered the project's needs. The soccer field itself was our image target; at roughly 2 meters long, it was larger than a player's field of view, so the camera only ever saw a portion of the field. We dealt with that using extended tracking and strict control over the ambient lighting.
Networking. We needed a networking solution that worked over WLAN without internet access and was compatible with Android. After searching for viable options under these restrictions, we chose Photon On-Premise Server: it met our needs and offers a stable, easy-to-implement solution. Merging networking with an augmented reality experience was a big challenge; synchronizing physics, positions, and particle effects while tracking an image target required a lot of testing and rework on the gameplay and mechanics.
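The core idea behind syncing networked state with AR tracking can be sketched as follows (an illustrative Python model, not Photon's actual API): shared positions are expressed in the image target's local space, and each client maps them into its own tracked world frame.

```python
# Sketch: the ball's position is synchronized in the soccer-field
# (image target) local space; each client converts it to world space
# using its own tracked pose of the target. Names and numbers are
# illustrative; a real client would also apply rotation and scale.

def to_world(target_origin, local_pos):
    """Map a field-local position into one client's world frame
    (translation only, for brevity)."""
    return tuple(o + p for o, p in zip(target_origin, local_pos))

# The server broadcasts the ball in field-local coordinates.
ball_local = (0.5, 0.0, 1.2)

# Each client tracks the image target at a different world position.
clients = {"phone_a": (10.0, 0.0, 0.0), "phone_b": (-3.0, 0.0, 4.0)}

rendered = {name: to_world(origin, ball_local)
            for name, origin in clients.items()}
```

Because only target-relative coordinates travel over the network, every player sees the same ball in the same spot on their own tracked field.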
Mixed Reality Experience

Mixed Reality Solution. We used two HoloLens headsets, microcontroller boards, a PC Unity client, and a local Photon server, all connected to the WLAN and exchanging messages to synchronize the game session, feedback events, and live streaming. We needed to create an editor scene to set up the game scene: it places and manipulates holograms so they fit the physical space. For example, we needed to match a particle effect to a training cone's position to emphasize a real obstacle. One important piece was the world anchor set up in the editor scene; it maintains a world reference that keeps all virtual elements in place between game sessions.


Networking. Despite the lack of documentation on integrating Photon with HoloLens, we wanted to use it because of how reliably it had worked in the AR project. It worked so well that we were able to synchronize game objects' transforms and events between the HoloLens devices and external clients running on PC.


Hardware Integration. We had to show messages on an LED display and animate LED strips and lights to create a real-life arcade feel and give feedback to the audience. A Windows app with a Photon client, built in Unity with the same RPC list as the HoloLens apps, receives the event messages and shows them on the LED screen. For the LED strip and light animations we used Arduino Yún boards, which expose communication via web services, so we used Unity's WWW object from the HoloLens app to trigger animations on the Arduinos. All devices were connected to the same LAN.
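Triggering a board this way amounts to a plain HTTP GET against the Yún's web service. A minimal Python sketch (the `/arduino/...` path style follows the Yún Bridge REST convention, but the `anim` command, animation name, and IP are hypothetical; on HoloLens this was done with Unity's WWW object):

```python
# Sketch: build the GET URL that launches an LED animation on one
# Arduino Yun board over its web-service interface.

def animation_url(board_ip, animation):
    """URL for a hypothetical 'anim' command handled by the sketch
    running on the Yun."""
    return f"http://{board_ip}/arduino/anim/{animation}"

url = animation_url("192.168.1.50", "goal_flash")
# Issuing an HTTP GET to `url` would fire the animation, e.g.:
#   urllib.request.urlopen(url, timeout=2)
```

Each gameplay event just maps to one such request, which keeps the hardware side decoupled from the game logic.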
Live Preview Solution. To reinforce the audience feedback, we needed to show a live stream from both HoloLens devices on a TV screen. In the beginning we had no idea how to do that; the only clue was the live preview in the Windows Device Portal for HoloLens. Our final solution was based on that: a small HTML page that streams video from each device via its IP address.
In the end it was a big success: all the experiences offered new ways to connect a popular activity (soccer) with technology, merging fans' passions. From children to seniors, everybody enjoyed the experiences, showing that there is no age limit for experimenting with new technologies.

Wasd Studio
Contributors
Michael Velasco
CTO - Programmer
Fabian Hernandez
3D Technical Artist - Artist