Porting Daydream’s 3DOF Arm Model to Oculus Go and Beyond (Download)
Link to the Github project is at the bottom.
1/13/2020 UPDATE: Unity's XR input now includes a pre-made arm model that is very similar to the one from Daydream. More information can be found here.
Unlike Oculus Go, Daydream offers an open source project that details best practices for its platform: Daydream Elements. One of the most useful things Daydream Elements provides is an “Arm Model” script that extrapolates arm movement from device rotation. This script lets developers fine-tune tracking parameters to create a sense of presence while receiving only the controller's rotational input. However, the script only supports the Daydream device, so how would we port this mechanic to, say, the Oculus Go?
This article was originally posted on Medium, until I realized that Unity Connect is probably a better platform for it; the code examples are images because I had a hard time formatting code on Medium.

Porting the Daydream “Arm Model” to Unity XR Nodes

Creating the Base

We will use a base class for all our arm models so they can be referenced generically. This is useful when extending the arm calculations: for example, the XRArmVisualizer (included in the asset pack) can reference either an XRArm or an XRTransitionArm. We will call this class XRBaseArmModel.cs.
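A minimal sketch of what such a base class might look like (the member names below are assumptions for illustration; the attached package may expose a different API):

```csharp
using UnityEngine;

// Base class so visualizers and managers can reference any arm model
// (XRArm, XRTransitionArm, ...) generically.
public abstract class XRBaseArmModel : MonoBehaviour
{
    // World-space pose of the simulated wrist/controller, computed by subclasses.
    public abstract Vector3 ControllerPositionFromHead { get; }
    public abstract Quaternion ControllerRotationFromHead { get; }
}
```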

Getting Device Data

We start by creating a script called XRArm.cs. This script will calculate the player's arm position and rotation based on the headset rotation and their 3DoF controller. This version of the Arm Model uses the following information from the device:
  • The dominant hand of the player (left or right hand)
  • Hand rotation
  • Gaze direction
  • Headset's angular velocity
  • Headset's position (interpreted by Unity's InputTracking)
To get access to the tracked devices' sensors, we use the UnityEngine.XR namespace. Unity uses XRNodeState to represent the state of a tracked device, whether it's a headset, controller, or pointer. You can read more about this class in the documentation:
To store the states of the devices, we create a list called “nodeStates” to which the data can be assigned. To get the data from the tracked devices, we call InputTracking.GetNodeStates() in the Update function, so that our data is refreshed once per frame. This function accepts a list of XRNodeStates, to which the data is assigned; we pass in the previously created “nodeStates” list. After the data is assigned, we can iterate through the nodes and “try” to get the data we need. We use the “Try” functions because the nodes are generic and certain values may not be tracked, depending on the device. Note that the “Try” functions on node states generate a lot of garbage; this can be eliminated if the device you are targeting exposes the headset velocity directly. For example, Oculus's OVRManager lets developers access the headset's angular velocity via OVRManager.display.angularVelocity.
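The gathering step described above could be sketched as follows (a rough illustration assuming a right-handed player; the UpdateArmData call and left-handedness lookup are covered later):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class XRArm : MonoBehaviour
{
    // Reused every frame so GetNodeStates doesn't allocate a new list.
    private List<XRNodeState> nodeStates = new List<XRNodeState>();

    private Quaternion controllerRotation = Quaternion.identity;
    private Vector3 headAngularVelocity = Vector3.zero;

    private void Update()
    {
        // Fill the list with the current state of every tracked node.
        InputTracking.GetNodeStates(nodeStates);

        foreach (XRNodeState state in nodeStates)
        {
            // "Try" calls return false when a device doesn't track that value.
            if (state.nodeType == XRNode.RightHand)
                state.TryGetRotation(out controllerRotation);
            else if (state.nodeType == XRNode.Head)
                state.TryGetAngularVelocity(out headAngularVelocity);
        }

        Vector3 headPosition = InputTracking.GetLocalPosition(XRNode.Head);
        Vector3 gazeDirection =
            InputTracking.GetLocalRotation(XRNode.Head) * Vector3.forward;

        // Feed the gathered data into the arm calculation (explained below).
        // UpdateArmData(false, controllerRotation, gazeDirection,
        //               headAngularVelocity, headPosition);
    }
}
```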

Interpreting the position of the hand based on our data

In this section we will go over interpreting the arm rotation and position based on the device data gathered in the last section.

Handed Multiplier

Now that we have all our device data, we can calculate the arm position and rotation. We start by creating a function that accepts all of the device parameters mentioned earlier, and we call it `UpdateArmData`:
public void UpdateArmData(bool isLeftHanded, Quaternion controllerRotation, Vector3 gazeDirection, Vector3 angularVelocity, Vector3 headPosition) { // More code goes here }
Here we will perform a series of calculations to determine our arm position. We start by creating a Vector3 whose x component equals 1 if the player is right-handed, -1 if the player is left-handed, and 0 if handedness is centered or not provided. We store this value as “handedMultiplier”.
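A sketch of that step, mirroring the Daydream Elements approach (since our `UpdateArmData` takes a bool, the centered/not-provided 0 case is omitted here):

```csharp
// Mirrors the hand, so left- and right-handed players get symmetric arm offsets.
private Vector3 handedMultiplier;

private void UpdateHandedness(bool isLeftHanded)
{
    // y and z are always 1; only x flips with handedness.
    handedMultiplier.Set(0f, 1f, 1f);
    handedMultiplier.x = isLeftHanded ? -1.0f : 1.0f;
}
```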

Storing the local controller rotation

We will reference the controller rotation a few times in our calculations, so for easier readability we store it in a separate function called “UpdateControllerReferenceRotation”, which accepts a quaternion. We call the stored variable “localControllerRotation”.
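In its simplest form this is just a cached copy of the rotation passed into `UpdateArmData` (a sketch; the attached source may do additional reference-frame adjustment here):

```csharp
// The controller's local rotation, cached once per frame for the later steps.
private Quaternion localControllerRotation;

private void UpdateControllerReferenceRotation(Quaternion controllerRotation)
{
    localControllerRotation = controllerRotation;
}
```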

Interpreting the player’s torso direction

Usually a person's torso faces the direction of their gaze. When the player is moving, however, the direction of the torso is also affected by the user's rotational acceleration. Because of this, we must use an algorithm to calculate the user's torso direction. We do this calculation in a function called UpdateTorsoDirection, which accepts the gazeDirection and angularVelocity values. We store the calculated values in two variables: Vector3 torsoDirection and Quaternion torsoRotation.
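The idea can be sketched as below: flatten the gaze onto the horizontal plane, then blend the torso toward it more aggressively the faster the head is turning. The filter constants follow the spirit of Daydream Elements' arm model and are meant to be tuned:

```csharp
private Vector3 torsoDirection = Vector3.forward;
private Quaternion torsoRotation = Quaternion.identity;

private void UpdateTorsoDirection(Vector3 gazeDirection, Vector3 angularVelocity)
{
    // The torso only yaws, so ignore pitch by projecting gaze onto the XZ plane.
    gazeDirection.y = 0.0f;
    gazeDirection.Normalize();

    // Blend toward the gaze faster when the head turns quickly, so the torso
    // lags naturally during slow glances but follows deliberate turns.
    float angularSpeed = angularVelocity.magnitude;
    float gazeFilterStrength = Mathf.Clamp((angularSpeed - 0.2f) / 45.0f, 0.0f, 0.1f);
    torsoDirection = Vector3.Slerp(torsoDirection, gazeDirection, gazeFilterStrength);

    // The rotation form of the same direction, used by later arm-joint math.
    torsoRotation = Quaternion.FromToRotation(Vector3.forward, torsoDirection);
}
```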

Merging our script with Daydream Elements

Now that our code looks fairly similar to the Daydream Elements code, and because that code is well commented, I will skip the explanation of the remaining calculations and provide the code along with additional ports from the Daydream Elements package.


  • *You will have to use your device's SDK to set the IsLeftHanded value.
  • *We use the UpdateArmData function instead of doing the calculations directly in Update so that this system can:
  • Be driven by a single controller, to lerp between two arm models
  • Be fed generically from a manager that passes the data to the XRArms

Download Package and Source Code
The attached scripts are near-exact copies of the Google Daydream Elements Arm Model scripts. They include a visualizer and a script that transitions between two separate arms, which can be useful in scenarios where you want to toggle between two arm calculations, such as pointing and throwing.
Krystian Babilinski
Unity Developer - Programmer