VR Design Insights
Published 10 months ago
Part One
Racket: Nx is an Arcade Space Sport for VR, developed in Unity and featuring Waves Audio’s Nx 3D audio plugin. Imagine playing racquetball in a 360° arena, with classic arcade-style game elements. Now put that arena in outer space and you've got RNX!
The game’s currently out on Steam Early Access, and we’ll continue its development throughout the year until our final launch in Q4 2017.
In this post and the next I'll be describing some of our development process, VR specific deliberations, and how we approached some of the multiplayer aspects of the game.
This is what the game is about!

Getting started

Racket: Nx started as a tech demo, the result of a 3-day jam.
We were approached by Waves Audio, who had developed a Unity plugin for their Nx technology and wanted to showcase it in a VR game. (Waves, for those who don’t know, is a world leader in digital audio plug-ins - there's hardly a musician or producer who doesn't work with their tools. Their Nx technology does realtime 3D audio simulation, and using it in VR substantially improves ease of orientation and immersion.)
So when we started the design process a few weeks earlier, what we were looking for was a solid core mechanic. We knew we wanted it to be 360° (for Nx to shine), we knew we wanted it to be physical (VR BABY!), and we knew that since there were no conventions for VR games to build on, we needed something anyone could pick up and immediately get (we also knew we wanted it to be psychedelic and spacey, for personal reasons).
After letting our minds process this for a while, separately and in hive mode, we landed on the simple concept of playing Breakout/Arkanoid on a 360° dome of bricks, with a handheld racket, and portals that could teleport the ball to a random position one could only predict via audio.
Our original tech demo did just that, and there was much rejoicing, so we continued development.
Screenshot from the tech demo ^___^

Designing for VR

So we had a simple and clear core mechanic, and now we wanted to build a game around it.
In several respects, this was no different from making any other non-VR game - defining the rules, balancing the numbers, getting the right pacing. Even getting the feel of the physics and controls right was pretty straightforward.
Where we were really treading uncharted territory was everything affordance- and UX-related. How do we convey information to the player when we don’t know where they’re looking? How does one plan a game’s UI layout in an endless 3D environment? How do we break free from old conventions, like HUDs and mouse cursors, but keep everything intuitive? How daring can we be and get away with it?
Many of these questions still don’t have definite answers, and I think that’s part of what makes developing for VR so exciting. Still, we discovered a lot as we developed RNX, and as more VR games came out trying different approaches to these and other similar challenges.
Our Floor Timer is a prime example of this.
In RNX’s single player modes, the player plays against time. There is a timer, counting down, that can be replenished by hitting the right targets on the walls. When the timer reaches 0, you’re out.
We didn’t want a billboard with a number, we wanted to completely avoid a HUD (which never feels right in VR), and since the game is 360°, we came up with the Floor Timer. Basically, a circle of orange light on the floor closes in on the player as their time runs out.
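The mechanic above can be sketched roughly as follows. This is an illustrative Python sketch, not the game's actual Unity code - all names and numbers (max time, ring radius, replenish bonus) are hypothetical:

```python
# Hypothetical sketch of the Floor Timer: a countdown that target hits
# replenish, rendered as a ring whose radius shrinks toward the player.
class FloorTimer:
    def __init__(self, max_time=60.0, max_radius=5.0):
        self.max_time = max_time      # seconds of play at a full timer
        self.max_radius = max_radius  # ring radius (meters) at a full timer
        self.time_left = max_time

    def tick(self, dt):
        """Advance the countdown; returns False once time has run out."""
        self.time_left = max(0.0, self.time_left - dt)
        return self.time_left > 0.0

    def on_target_hit(self, bonus=5.0):
        """Hitting the right target replenishes the timer (capped at max)."""
        self.time_left = min(self.max_time, self.time_left + bonus)

    def ring_radius(self):
        """The orange circle of light closes in as time runs out."""
        return self.max_radius * (self.time_left / self.max_time)
```

The point of the design is in `ring_radius`: the player's remaining time is encoded spatially, as how close the ring has gotten, instead of as a number on a HUD.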
The floor timer.
Super visible and clear. Or so we thought.
People weren’t seeing it. It was right in front of them, it pulsed and beeped with increasing urgency, it encroached in discrete ticks, we even had it light up and draw backwards before each game started. Nada. People didn’t know why they were losing.
Too loose a visual association? Too much visual information to sift through? Too peripheral?
We started coming up with extreme solutions. Particles drawing attention to it, representing it on the frames between the bricks instead of the floor, etc.
Then, one day, we noticed people weren’t asking about it anymore. And we didn’t even do anything. It passed its trial by fire, and suddenly it made sense to people.
I ascribe this to two things:
  • People started getting used to being in VR. And when your conscious attention isn’t bombarded with a totally new situation to figure out, it can devote more resources to finer details.
  • Our player base was growing, people saw videos of other people playing before doing so themselves, and the information got passed along.
So this is not a story of how we cracked some VR design problem with a resounding eureka! This is a story that illustrates, I hope, some of the surprising weirdness that comes with developing for VR in these early days.
Affordance depends on the interpreter as much as it does on the messenger. And since VR literacy is still in its infancy, what works and what doesn’t is changing in front of our eyes. We are watching new conventions taking form around us, and having a say in it is a privilege.
Our UI, for example, makes a point of never using a laser pointer, instead providing an intuitive point-and-select interface. We can do this because conventions have not yet been set in stone.
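One plausible way to implement laser-free point-and-select is to pick whichever UI element lies closest to the controller's forward direction, within a tolerance cone - the post doesn't describe the actual implementation, so this Python sketch and all its names and thresholds are assumptions for illustration only:

```python
# Illustrative sketch (not the game's actual code): select the UI element
# best aligned with the controller's forward vector, within a cone.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def select_target(origin, forward, targets, max_angle_deg=15.0):
    """Return the name of the target best aligned with `forward`,
    or None if nothing falls within the selection cone."""
    forward = normalize(forward)
    # cos of the cone half-angle: larger dot product = smaller angle
    best, best_dot = None, math.cos(math.radians(max_angle_deg))
    for name, pos in targets.items():
        to_target = normalize(tuple(p - o for p, o in zip(pos, origin)))
        dot = sum(f * t for f, t in zip(forward, to_target))
        if dot > best_dot:
            best, best_dot = name, dot
    return best
```

Because selection happens on pointing direction alone, no visible ray needs to be rendered - the feedback can live entirely on the highlighted element itself.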
In part 2 I'll discuss a bit of our multiplayer design, specifically our aim of getting VR a few steps closer to eSports.
Ofer Reichman
CTO - Programmer