A close look at developing comfortable and intuitive interactions for 3DOF VR controllers in Along Together on Daydream.
So you wanna make a game using a 3DOF controller? It's just like designing for 6DOF controllers, as long as your players stay completely still and imagine their hands are attached to the end of a two-foot pole. Unfortunately, player expectations don't decrease with the DOFs.
I mean, I can’t blame people for having those expectations, which is why bridging that gap was one of the most important parts of our design process for Along Together on Daydream. We went in thinking that simpler would be better, and ended up making a game that uses precise interactions to navigate a character in third person while moving objects around the world in first person at the same time.
In this post, I’ll go into detail about the successes and failures of our interaction prototypes and how we went about building our mechanics around the strengths of the Daydream controller.
Controller Interaction Prototypes
Conceptually, we liked the idea of letting the player manipulate objects directly with the motion of their controller to help build a connection between them and the game world. It’s the kind of interaction you can’t get out of a traditional gamepad and it’s perfectly suited for a 3DOF controller. By reading the controller’s gyro data, we detected changes in the controller’s orientation and moved the controlled object in the game world to mimic that motion directly.
On the plus side, it felt really good to see an object in the world match up with your controller movements, so that was definitely a win. However, there was a big hitch with this approach: it was super uncomfortable! Rotating objects to the proper orientation often meant bending your wrist in awkward, strained ways.
In an attempt to solve this issue, we tried increasing the tracking speed of the object so that a 90-degree rotation of the controller would map to 180 or even 360 degrees of rotation in-game. But the sped-up object became more difficult to control and align properly when trying to solve puzzles, and the amplification also ruined that novel, one-to-one feel we were going for in the first place.
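In pseudocode terms, the mapping (and the amplification we experimented with) boils down to something like the following minimal Python sketch. The names here are illustrative, not actual Daydream SDK calls:

```python
def apply_controller_rotation(object_rotation, gyro_delta, sensitivity=1.0):
    """Rotate the controlled object by the controller's per-frame
    orientation change, scaled by a sensitivity multiplier.

    object_rotation: current (yaw, pitch, roll) of the object, in degrees
    gyro_delta:      per-frame (yaw, pitch, roll) change from the gyro
    sensitivity:     1.0 is one-to-one; 2.0 maps a 90-degree twist of the
                     wrist to 180 degrees of in-game rotation
    """
    return tuple(angle + sensitivity * delta
                 for angle, delta in zip(object_rotation, gyro_delta))
```

With `sensitivity=1.0` you get the comfortable-looking but wrist-twisting direct mapping; cranking it up reduces the wrist strain but, as noted above, makes fine alignment harder.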
So it was a good first try, but we decided to move on to something else.
The idea here was to have the player mimic certain gestures with the controller to manipulate objects. For example, if the player had to interact with a giant crank, they would make a circular motion with the controller to get the crank to spin. This kind of abstracted interaction worked really well in Zack & Wiki, a game we frequently referenced during production, so we thought it would be interesting to bring it to VR!
I’m not gonna lie, I was pretty excited about this control scheme while I was working on it. It just felt really cool to interact with objects in a way that seemed natural and was implied by the object’s visuals.
After about a day of prototyping, I was all amped to show the team what I'd been working so hard on. It had come together even better than I anticipated! By interpreting accelerometer data from the controller, I was able to detect both the direction and orientation of rotation, meaning the player could rotate cranks both forward and backward, in any orientation, using a controller with no positional tracking!
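For the curious, the core direction test can be sketched roughly like this. This is an illustrative Python reconstruction, not our shipped detector: it sums the cross products of successive motion steps (derived from the accelerometer samples) and uses the sign to tell which way the controller is circling:

```python
def crank_direction(samples):
    """Given recent 2D motion samples (e.g. integrated from accelerometer
    data), return +1 for counter-clockwise circling, -1 for clockwise,
    or 0 if the motion isn't circular enough to call."""
    total = 0.0
    for i in range(1, len(samples) - 1):
        # Two successive displacement vectors along the motion path.
        ax = samples[i][0] - samples[i - 1][0]
        ay = samples[i][1] - samples[i - 1][1]
        bx = samples[i + 1][0] - samples[i][0]
        by = samples[i + 1][1] - samples[i][1]
        # z-component of the cross product: positive when turning CCW.
        total += ax * by - ay * bx
    if abs(total) < 1e-3:  # hypothetical dead zone for ambiguous motion
        return 0
    return 1 if total > 0 else -1
```

The catch, as I was about to find out, is that this only works when the player's motion actually traces the arc you expect.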
I shoved a controller into my business partner's hand with a glint of excitement in my eye... which quickly faded as I watched in horror while he struggled to even make the stupid crank budge. He tried all kinds of hand motions and controller orientations, but to no avail. When he finally did get the crank to move, it was unpredictable and often not the motion he was going for. The crank would frequently change direction unexpectedly and just awkwardly wiggled around like this:
It was in this moment I learned what anyone who has ever worked with motion controls knows: every single person who picks up that controller moves their hand differently. What seems intuitive to one player may make no sense whatsoever to another. Even after attempts to make the gesture recognition more generous, it was clear that a one-size-fits-all solution would ultimately make object interactions pretty unsatisfying and, really, just no fun.
Hindsight being 20/20 and all, it turns out that Zack & Wiki had some similar issues. Looking at its interactions, many of them devolve into some form of waggling the controller about like a crazy person. The game also has to give you a pretty detailed tutorial every time you use an item, which gets kind of exhausting after a while.
Additionally, because the controller only affords three degrees of freedom, neither this control type nor the previous one had a clear solution for letting the player push or pull objects through the environment (an interaction we wanted to support). So we set our sights on a more flexible control scheme:
For this control scheme, the player could click on an object and then drag their cursor on the ground to manipulate that object along a defined axis. It worked well overall and fit naturally with the rest of our planned point-and-click gameplay.
As you can see above, the results were pretty smooth, but it had one big issue: it only worked that smoothly when the reticle was moving along flat ground which, as you may imagine, doesn’t make for very interesting level design.
The issue was that as the reticle collided with various obstacles in the environment, it would jump around to match those surfaces. This jumpy movement looked fine for the reticle itself, but caused erratic behavior of the controlled object. In an effort to solve this issue quickly, we tried dampening the controlled object’s movement relative to the reticle:
But this had its own set of stuttering issues and, overall, made object manipulation a tedious process, especially when trying to align multiple objects with one another.
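Under the hood, the drag interaction amounts to projecting the reticle's hit point onto the object's movement axis and then easing the object toward that target. Here's a rough Python sketch of the idea; the function names and smoothing constant are illustrative, not our actual code:

```python
import math

def project_onto_axis(point, origin, axis):
    """Scalar position of `point` along a unit-length `axis` through
    `origin` -- this is where the reticle 'wants' the object to be."""
    return sum((p - o) * a for p, o, a in zip(point, origin, axis))

def damped_step(current, target, smoothing, dt):
    """Exponential smoothing toward the target position (the dampening
    we tried). Smaller `smoothing` is snappier; larger is floatier --
    and floatier is where the tedium came from."""
    t = 1.0 - math.exp(-dt / smoothing)
    return current + (target - current) * t
```

The dampening hides the reticle's jumps, but only by making the object lag behind the player's hand, which is exactly the sluggishness described above.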
Despite these problems, they seemed more like solvable kinks than core flaws that rendered the design unusable. PLUS, this control scheme felt really natural (when it worked properly) and showed much more promise than our previous attempts. We just had to figure out some way to smooth out the object's motion or, alternatively, some way to smooth out the reticle's motion.
Smoothing Things Out
What we really needed was some kind of flat surface along which the player could slide their reticle, since that was the ideal scenario for creating smooth object movement. As much as we wanted to use the ground as that surface, there were a lot of scenarios in which the "ground" is not so clearly defined. Depending on the object and its surroundings, it sometimes made sense to drag the reticle along the ground, a wall, a platform, or another oddly-oriented surface. The problem with each of those surface options, however, was that none of them was guaranteed to be flat!
Enter: the interaction plane.
What we call an “interaction plane” is basically an invisible plane created when the player grabs an interactive object. While interacting with an object, we no longer allow the reticle to collide with any environment objects, and, instead, only allow it to collide with the interaction plane. Since the plane is always oriented properly to the object’s axis of movement, the player simply slides their cursor along it to easily manipulate the target object.
In the GIF above, the blue hand (the player's reticle in this case) never penetrates below the red plane, but instead slides along its surface. Depending on the position of the reticle on the interaction plane, we determine how quickly and in which direction the object should move. In this way, the reticle always has a smooth surface on which to move, making object manipulation much simpler and more predictable for the player.
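The math behind the interaction plane is just a standard ray-plane intersection: cast the controller's pointer ray against the invisible plane instead of the environment geometry. A minimal Python sketch (names are illustrative, not our actual code):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def intersect_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the point where the controller's pointer ray hits the
    interaction plane, or None if the ray is parallel to the plane or
    the plane is behind the controller."""
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-6:
        return None  # ray runs parallel to the plane
    t = dot(sub(plane_point, ray_origin), plane_normal) / denom
    if t < 0:
        return None  # intersection is behind the ray's origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

Because the plane is mathematically flat by construction, the hit point can never jump the way it did when the reticle collided with arbitrary level geometry.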
In the end, this control scheme became the basis for the environment interactions that shipped in Along Together. With just a few simple rules governing how the reticle interacts with the world, players can easily and intuitively manipulate all kinds of objects, including ones that require pushing and pulling!
After working through and analyzing each of our interaction prototypes, I wouldn’t entirely rule out one-to-one motion or gesture controls as forms of input for 3DOF controller games. They each have their merits, but where the mechanics of Along Together tended to highlight the flaws of those control schemes, the reticle drag scheme played to the strengths of the Daydream controller and supported the strengths of our gameplay.
If you want to see this stuff in action for yourself, check out the game on Google Play and let us know what you think!