Ideas formed in the minds of science fiction writers are often predictions of the future. The internet, touch screens, the lunar landing, and atomic bombs were all forecast by sci-fi writers. For the last half century, virtual reality has been repeatedly envisioned in popular sci-fi. Neuromancer's Matrix (William Gibson, 1984), Snow Crash's Metaverse (Neal Stephenson, 1992), and the more recent Ready Player One's Oasis (Ernest Cline, 2011) are all predictions of what the future will bring through virtual reality. The future is today.
It all started on the first of August 2012, the launch of the Oculus Rift Kickstarter: a critical hit to the geek nerve. The first pledges were made from our office in Rotterdam right away, and ideas were thrown around like fireballs in a barn. "Somebody will make a diving game, so it better be us!" blasted Creative Director Richard Stitselaar. While Vertigo Games had originally been a company balancing its workload between in-house entertainment games and contract-based applied-gaming work, this day paved the road we are walking today.
World of Diving
We started working on our first VR project and sandbox the day we received our DK1. World of Diving is one of the most obvious experiences anyone can imagine when thinking about virtual reality; after all, the headset already feels like a scuba diving mask! A small team built the first version of the game, and we quickly pushed an early build to the public through Steam Early Access. Shortly after that first version was available, we were invited to the Valve office in Seattle to check out something new in VR: the Vive.
This challenged us: we now had to support multiple VR devices in a rapidly changing environment, and to spice things up even more, motion controllers came into play. We started working on our own modular implementation so we could easily support new VR systems, update existing ones quickly, and add VR compatibility to new projects with just a few clicks.
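To illustrate the idea behind such a modular layer, here is a minimal sketch. The class names, registry, and pose values are our own illustration, not Vertigo's actual code: each headset implements a common interface, and supporting a new device means adding one registry entry while the game code stays untouched.

```python
from abc import ABC, abstractmethod


class VRDevice(ABC):
    """Common interface every supported headset implements."""

    @abstractmethod
    def initialize(self) -> bool:
        """Start up the vendor SDK; return True on success."""

    @abstractmethod
    def get_head_pose(self) -> tuple:
        """Return the tracked head position as (x, y, z) in metres."""


class RiftDevice(VRDevice):
    def initialize(self) -> bool:
        return True  # a real implementation would call the Oculus SDK here

    def get_head_pose(self) -> tuple:
        return (0.0, 1.7, 0.0)  # placeholder pose for the sketch


class ViveDevice(VRDevice):
    def initialize(self) -> bool:
        return True  # a real implementation would call OpenVR here

    def get_head_pose(self) -> tuple:
        return (0.0, 1.6, 0.0)  # placeholder pose for the sketch


# Adding a new HMD only requires a new class and a registry entry.
DEVICE_REGISTRY = {"rift": RiftDevice, "vive": ViveDevice}


def create_device(name: str) -> VRDevice:
    """Instantiate and initialize the requested headset."""
    device = DEVICE_REGISTRY[name]()
    if not device.initialize():
        raise RuntimeError(f"failed to initialize {name}")
    return device
```

The payoff is that gameplay code talks only to `VRDevice`, never to a vendor SDK, which is what makes "VR compatibility with a few clicks" plausible.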
However, we struggled with how to add motion controller support to World of Diving; having to manually swim with your arms didn't seem like an inviting experience. We still had plenty of ideas left from our brainstorming sessions, so we chose to start a new game from scratch.
In addition to motion controller support, the Vive also introduced room-scale VR: the ability to physically walk around in our virtual worlds. We noticed that the first thing VR newcomers do is play around with the new reality they find themselves in, rather than actually playing the game in front of them. And that is exactly what we are aiming for Skyworld to be. We can experience the actual fantasy of our childhood: be the king of our own realm and command the magical toy dragon.
A particularly tricky topic in VR is that of user interfaces. Having a window stuck to your head is annoying, so ideally you want the interfaces to live in the world, or to be the world itself. Skyworld has gone through many iterations of its user interaction. Since there is no reference material or research yet, and nobody really knows how to do UIs well in VR, we started off with classic UIs. In our second major pass, screen-based UI was traded exclusively for world-space UI, which works very well in VR and is already good enough for most purposes, but not quite perfect. Eventually we got rid of a bunch of UI altogether and traded it for world interaction. For example, interacting with one of your buildings now flips the table and puts you right inside it, where you can give orders to your workers and see the building's statistics instead of looking at boring graphs and numbers.
During the development of Skyworld, we also received development kits for a couple of other, experimental VR systems. Adding support for the new HMDs in our collection wasn't much effort anymore: we could now truly ship our games multi-VR-platform! The most impactful difference between HMDs is performance; one device might easily hit 90 fps while another dips below 60, especially when running on its own standalone hardware. We have invested time in tools that change the game's quality and settings based on which HMD is running. This includes toggling geometry altogether, tuning down or swapping effects, and changing the player environment.
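The per-HMD tuning described above can be sketched as a table of quality profiles. The profile names, fields, and values here are our assumptions for illustration, not Vertigo's actual configuration: a powerful desktop headset gets the full experience, while a weaker device gets a cheaper one to hold its framerate.

```python
from dataclasses import dataclass


@dataclass
class QualityProfile:
    """Per-headset quality settings (illustrative values only)."""

    detail_geometry: bool  # toggle decorative geometry on or off
    effect_quality: str    # "low", "medium", or "high"
    environment: str       # which player surroundings to load


# Hypothetical profiles: one for desktop-class HMDs, one for
# standalone devices that struggle to hold a high framerate.
PROFILES = {
    "desktop_hmd": QualityProfile(True, "high", "castle_hall"),
    "standalone_hmd": QualityProfile(False, "low", "simple_room"),
}


def apply_profile(hmd: str) -> QualityProfile:
    """Pick the profile for the running headset.

    Unknown hardware falls back to the cheapest profile, since
    dropping detail is safer than dropping frames in VR.
    """
    return PROFILES.get(hmd, PROFILES["standalone_hmd"])
```

Keeping the settings in data rather than scattered through game code means a newly supported headset only needs a new profile entry.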
One of the comments we often hear after someone has experienced Skyworld is: "Will people have enough room for such a setup at home?" This is a real concern, as Skyworld is an outside-in experience (the player looks into the game world) and you have to be able to walk around the game table. Arizona Sunshine offers the inverse: the game comes towards you. Since you are the center of attention, you don't need as much space to play.
Arizona Sunshine was one of our earlier brainstorm ideas and has proven a favorite of the many friends and family who have played our VR games and prototypes. Shooting guns in VR is fun; it is a completely new experience, as you can actually point your guns in any direction. But the thing that is simply magic is seeing one of your friends in VR, watching your back against an endless wave of zombies, and mocking him with funny gestures as you rack up more kills.
One of our current challenges is the visualization of players in VR. Inverse kinematics for the local player is simply weird: there are too many degrees of freedom, so there will always be cases where the virtual body does not match its real-world counterpart. It has therefore become common practice in VR development to visualize only the player's hands, no arms attached, and this works fine.
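The hands-only approach can be sketched very simply; the names and types below are our own illustration. Instead of solving IK for a full skeleton, the tracked head and controller poses are copied directly onto the visible models, so the avatar can never contradict the player's real pose.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """A tracked pose: position (x, y, z) in metres, rotation as a quaternion."""

    position: tuple
    rotation: tuple


@dataclass
class AvatarState:
    """Everything needed to render a player as a head and two hands."""

    head: Pose
    left_hand: Pose
    right_hand: Pose


def avatar_from_tracking(head: Pose, left: Pose, right: Pose) -> AvatarState:
    """Map tracked poses 1:1 onto the visible models.

    No IK solve, no arm skeleton: the three tracked points are
    exactly what the hardware reports, so nothing can look wrong.
    """
    return AvatarState(head=head, left_hand=left, right_hand=right)
```

This is also a small network payload: three poses per player are enough to replay a convincing avatar on a friend's machine.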
We are living through a milestone in the evolution of the human species into other dimensions. For now, we visualize our avatars as floating heads and hands. But how many years will it take before we strap on our full-body suits in the morning and plug ourselves into the Metaverse?