AR Sandbox Game: Wonderful World
A next-generation AR sandbox game with hand gesture control.
Inspired by the classic AR Sandbox installation by UC Davis, we developed a next-generation AR game with hand gesture control. We bring haptic augmented reality gameplay to the next level: ultra-fast interaction feedback, intuitive controls, and great visuals make our sandbox game a magical experience. Capture the floating nature fragments with your hands and slowly build up your own landscape.
The game was developed over a four-month project phase during our studies at Filmakademie Baden-Württemberg.
I use a Microsoft Kinect 2 to get the height information of the sand and the hands. But the raw Kinect 2 depth image is completely unusable for this use case: it is noisy as hell, and wherever there is a "hard" border, the depth at that border becomes zero, among other problems. So my first step was to create some sort of filtering and smoothing of the Kinect image. I created a small GIF to show you the effect.
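The write-up doesn't show the actual filter, but the two problems it names (per-pixel noise and zero "holes" at hard borders) suggest something like a hole-filling pass plus temporal smoothing. Here is a minimal sketch of that idea in Python/numpy; the function name and the `alpha` value are my own, not from the project:

```python
import numpy as np

def filter_depth(frame, history, alpha=0.1):
    """Fill zero 'holes' and temporally smooth a raw depth frame.

    frame:   2D array of depth values in mm; 0 means 'no reading'.
    history: running smoothed depth from previous frames (same shape).
    alpha:   smoothing factor (hypothetical value, tune per setup).
    """
    # Where the sensor reported 0, fall back to the last known depth.
    filled = np.where(frame == 0, history, frame).astype(np.float32)
    # An exponential moving average damps per-pixel flicker over time.
    smoothed = (1.0 - alpha) * history + alpha * filled
    return smoothed
```

You would call this once per incoming frame, seeding `history` with the first valid frame; a small `alpha` means more smoothing but slower response.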
Usually, when you move your hand above the sand, its height information would also be projected onto the sand as really big mountains - but I didn't want that, so I thought about a way to filter out the hands. I found a really cheap way of doing it. The basic explanation: I just discard all depth information coming in after the setup above a certain height threshold. Once the hand is within 3 cm of the sand, it gets recognized as terrain again (but there is more than one threshold, for fast interactions, etc.). This results in a really magical interaction - in fact, it's the one thing most people ask me how it is possible. The thresholded data is then processed with OpenCV to detect certain hand shapes for the user input.
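A single-threshold version of this trick can be sketched in a few lines: compare the live depth against the bare-sand depth captured during setup, and treat anything sticking up more than the cutoff as a hand. The function name and the exact comparison are my own illustration of the described idea, assuming a downward-looking camera where larger depth means farther away:

```python
import numpy as np

def split_sand_and_hands(depth, baseline, hand_threshold_mm=30.0):
    """Classify each pixel as terrain or hand by height above the sand.

    depth:     current smoothed depth frame (mm, larger = farther away).
    baseline:  depth of the bare sand captured during setup.
    hand_threshold_mm: the ~3 cm cutoff mentioned in the text.
    """
    # Height above the sand surface: anything closer to the camera
    # than the baseline sticks up out of the sand.
    height_above_sand = baseline - depth
    hand_mask = height_above_sand > hand_threshold_mm
    # Terrain keeps the measured depth; hand pixels fall back to the
    # baseline so they never show up as mountains.
    terrain = np.where(hand_mask, baseline, depth)
    return terrain, hand_mask
```

The resulting `hand_mask` is exactly the kind of binary image you can feed into OpenCV contour/shape detection for the gesture input.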
Because this whole setup would otherwise be super slow, I cut the Kinect image into 8 overlapping pieces which I assign to different threads. Threading (and especially multi-core threading) is by far the most complicated part (in Unity). But it is also very important for a game like this: as we use the calculated positions of the players' hand shadows to capture objects, you don't want these to lag behind the shadow. The digging part of the game also feels very natural and satisfying because of the really fast interaction speed.
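The project itself does this in Unity/C#, but the partitioning idea is language-agnostic: split the frame into overlapping strips so border pixels still see their neighbours, process the strips in parallel, and stitch only the non-overlapping cores back together. A sketch of that scheme (strip count and overlap are illustrative):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def process_in_strips(image, work_fn, n_strips=8, overlap=4):
    """Run work_fn on overlapping horizontal strips in parallel.

    The extra `overlap` rows on each side give neighbourhood filters
    enough context so strip borders don't show seams; only the core
    rows of each strip are copied back into the output.
    """
    h = image.shape[0]
    strip_h = h // n_strips
    out = np.empty_like(image)

    def run(i):
        core_start = i * strip_h
        core_end = h if i == n_strips - 1 else core_start + strip_h
        start = max(0, core_start - overlap)
        end = min(h, core_end + overlap)
        result = work_fn(image[start:end])
        # Copy back only the non-overlapping core of this strip.
        out[core_start:core_end] = result[core_start - start:core_end - start]

    with ThreadPoolExecutor(max_workers=n_strips) as pool:
        list(pool.map(run, range(n_strips)))
    return out
```

For a pixel-wise `work_fn` the overlap is unnecessary, but for blurs or morphological filters it is what keeps the seams invisible.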
The terrain is created from a single displacement map generated from the filtered depth info. The rest happens in a shader: I generate the normals from the displacement by calculating a vector from a rectangle of surrounding height points - the normal map is used for correct light and shadow on the terrain. By taking the center point of that rectangle and comparing it with the surrounding height info, I can calculate a curvature map to get nice stony mountaintops and green valleys. I also use the normals to calculate the slope of the terrain to detect cliffs and to mask out the forests. In the gallery above is an image describing these effects.
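These maps are computed on the GPU in the actual game, but the math is easy to show on the CPU. Here is a numpy sketch of deriving normals, slope, and a curvature term from a heightmap; it illustrates the described technique rather than reproducing the shader (edges wrap for simplicity):

```python
import numpy as np

def shade_maps(height, scale=1.0):
    """Derive normal, slope and curvature maps from a 2D heightmap.

    height: 2D heightmap; scale: height units per pixel spacing.
    """
    # Central differences approximate the surface gradient.
    dy, dx = np.gradient(height * scale)
    # Normal = normalize(-dh/dx, -dh/dy, 1) for an upward heightfield.
    n = np.dstack((-dx, -dy, np.ones_like(height)))
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    # Slope: 0 on flat ground, approaching 1 on vertical cliffs.
    slope = 1.0 - n[..., 2]
    # Curvature: compare each point with its four neighbours (a
    # Laplacian); negative on peaks/ridges, positive in valleys.
    curvature = (
        np.roll(height, 1, 0) + np.roll(height, -1, 0)
        + np.roll(height, 1, 1) + np.roll(height, -1, 1)
        - 4.0 * height
    )
    return n, slope, curvature
```

Thresholding `slope` gives the cliff/forest masks, and the sign of `curvature` separates stony ridge tops from green valley floors.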
The landscape shader is really, really big - there is also some fake erosion going on which looks pretty awesome, and I use a lot of different noises to create the sand dunes, snowy mountains, forest colors, ocean waves, ... But that's just the nature landscape! There is also the blue, glitchy-looking world behind it, built with the same setup.
The game is about capturing & connecting pieces of nature - the connecting interaction needed nice visual feedback, as the players should feel some kind of "success". We decided to use a Voronoi algorithm for this effect: every nature object can morph itself to fit inside the surrounding Voronoi cells (via a shader), which gives a really cool, snappy visual feedback for connecting - and it also creates this hexagonal look everyone loves.
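The partition behind that morph effect is simple to sketch: every pixel belongs to its nearest site, and the cell boundaries are what the objects snap into. A brute-force Python/numpy version (site positions here are hypothetical, not the game's actual data):

```python
import numpy as np

def voronoi_regions(width, height, sites):
    """Assign each pixel to the index of its nearest site.

    sites: sequence of (x, y) positions. Returns a (height, width)
    array of site indices - a brute-force Voronoi partition.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    sites = np.asarray(sites, dtype=float)  # shape (n, 2) as (x, y)
    # Squared distance from every pixel to every site, broadcast
    # to shape (height, width, n).
    d2 = (xs[..., None] - sites[:, 0]) ** 2 + (ys[..., None] - sites[:, 1]) ** 2
    # The closest site wins the pixel.
    return np.argmin(d2, axis=2)
```

With sites arranged on a jittered triangular grid, these cells naturally approach the hexagonal look the text mentions.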
Where to play?
The game will be available to play at FMX Conference 2018 (April 24 - April 27) in Stuttgart, Germany. We are currently looking into other conferences & museums to exhibit the game. If you have any suggestions, please contact me.