We want to kick off the New Year by sharing a few details on the Oculus sensor’s hardware and software that allow for immersive, low-latency head tracking. Parts of this update will dig in deep on technical aspects of the sensor, so strap in! For the game designers out there, remember that a thorough and passionate understanding of sensor fusion is not required to build great VR games with the Rift, although it certainly can’t hurt ;-).
Every Millisecond Counts
When it comes to latency, every millisecond counts in making the experience as immersive as possible. There’s little argument that perfect virtual reality requires zero latency, but minimizing latency (especially in the display hardware) continues to be a major challenge.
Michael Abrash at Valve recently shared a fantastic engineering post on the technical challenges around latency for AR and VR: http://blogs.valvesoftware.com/abrash/latency-the-sine-qua-non-of-ar-and-vr/
At Oculus, we’re constantly researching ways to reduce real and perceived latency in the Rift. This means evaluating the latest hardware and developing creative solutions in software. One source of latency we decided to tackle early on was our motion sensor.
The New Oculus VR™ Sensor
The original Oculus Rift prototypes used a sensor that was readily available on the market, but ultimately we decided to develop our own sensor hardware to achieve an optimal experience. The new Oculus VR™ sensor supports sampling rates up to 1000 Hz, which reduces the time between the player’s head movement and the game engine receiving the sensor data to roughly 2 milliseconds.
The increased sampling rate also reduces orientation error by providing a denser dataset to integrate over, keeping the player’s real-world movements more closely in sync with the game.
The Oculus VR™ sensor includes a gyroscope, accelerometer, and magnetometer. When the data from these devices is fused, we can determine the orientation of the player’s head in the real world and synchronize the player’s virtual perspective in real-time. The Rift’s orientation is reported as a set of rotations in a right-handed coordinate system: yaw about the vertical Y axis, pitch about the X axis, and roll about the Z axis.
The process of combining the sensor data from all three devices into something useful is called “sensor fusion.” For those of you interested in learning more, we recommend Google’s tech talk on sensor fusion for motion processing.
The gyroscope, which reports the rate of rotation (angular velocity) around the X, Y, and Z axes in radians/second, provides the most valuable data for head orientation tracking. By continuously accumulating angular velocity samples over time, the Oculus SDK can determine the direction of the Rift relative to where it began.
Although the gyroscope provides orientation relative to the starting point, it leaves us with two challenges: it can’t provide the original orientation of the headset and it’s subject to a small amount of drift over time (imagine re-orienting your head back to perfect center but in-game you’re now looking slightly left or right).
These are obviously significant issues for any VR game with a fixed reference point (i.e., a game with a cockpit, where your head’s orientation does not affect the position of whatever car/plane/mech you’re piloting). Fortunately, we can leverage the accelerometer to estimate the “down” vector and the magnetometer to measure the strength and direction of the magnetic field. Combined, these allow us to correct drift in all three axes.
Making Developers’ Lives Easy…
If all of this drift correction and sensor fusion business seems like a lot of work, don’t worry! In addition to raw data, the Oculus SDK provides a SensorFusion class that takes care of the details, returning orientation data as rotation matrices, quaternions, or Euler angles. The SDK also includes a complete C++ “Oculus Room” example that demonstrates many different player input schemes integrated with Oculus VR™ head tracking.
If you’re using Unreal or Unity, it’ll be even easier: Oculus VR™ head tracking works right out of the box. We’re also working on detailed documentation and tutorials to make integrating the Rift into proprietary engines as painless as possible.
So, did all those sensor fusion details get you excited? Good news…
Oculus VR™ is looking for the best and brightest engineers to help build the future of virtual reality. If you have what it takes to help develop the next generation of gaming, let us know!
We’re located in sunny Irvine, California, just a few miles from Disneyland and the beach. You can always find an up-to-date list of available positions with detailed requirements at www1.oculus.com/careers.
Here are a few of the Jedi we’re looking for:
Senior Software Engineer
As a senior engineer, you will lead and contribute to a range of Oculus VR™ projects, including the Oculus SDK, our game engine integrations (Unreal, Unity), VR gameplay samples, internal and end-user facing tools, 3D authoring tool integrations (Maya, Max, CAD), 3D video playback, firmware and multi-platform driver development. We’re also researching positional tracking (optical and otherwise), kinematics, and alternative controller input, so there are plenty of challenges to dig into.
Ideal candidates have experience with multi-threading, multi-platform development, device drivers, 3D graphics, GPU programming, game engines, algorithm development, sensors and filters, and computer vision, along with a passion for VR.
Senior Mechanical Engineer
As a mechanical engineer at Oculus VR™, your designs will leave a footprint in VR history! You’ll help lead the design and engineering of Oculus VR™ products, including the Rift. You’ll also be responsible for building and testing mechanical models and translating industrial design into tooling drawings ready for manufacturing.
An ideal hire for this position has previous experience designing, building, and shipping a successful consumer electronics device (bonus points if it’s video game related!).
Senior Hardware Design and RTL Engineer
As a senior hardware design and RTL engineer at Oculus VR™, you’ll be creating the next generation of hardware designed specifically for virtual reality. You’ll be developing custom virtual reality focused designs on FPGAs, including display controllers and low-latency stereo camera interfaces. You’ll also work with the engineering team to identify and bring to life practical solutions to virtual reality problems.
We love to see candidates with experience designing, implementing, and synthesizing high speed SerDes blocks for LVDS, TMDS, and similar interfaces using Verilog or VHDL. Hardware hackers wanted! We’re always interested in seeing cool FPGA projects.
See you in the game!
— Palmer and the Oculus VR™ team