Rapid Robust Eye Tracking for Virtual Reality

 

Executive Summary

 

Eye tracking is essential for progress in virtual reality devices, which may be used for entertainment, medical evaluation and treatment, training, and advertising and retail applications. Current eye tracking technologies are based on video oculography (VOG), which is slow and expensive in both parts and power consumption. An alternative is photo-sensor oculography (PS-OG), which is fast and requires only inexpensive light sensors. Existing PS-OG technologies, however, are sensitive to equipment shifts (when the headset moves on the user’s head), which makes them unsuitable for mobile virtual reality headsets.

 

Description of Technology

 

This technology is a novel PS-OG eye tracking device controlled by machine learning algorithms. A neural network, trained for each user via calibration, compensates for equipment shifts. Combined with an innovative sensor placement structure, the method is faster than existing VOG techniques (up to 1000 Hz) and more accurate than existing PS-OG techniques. This fast, accurate eye tracking enables a more efficient, immersive virtual reality experience.
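
The patented architecture, sensor layout, and training procedure are not detailed here; the following is only a minimal sketch, assuming synthetic data and a generic scikit-learn multilayer perceptron, of how a per-user calibrated regression network could map raw photosensor readings to gaze coordinates while being trained to tolerate small headset shifts. All names, sizes, and the shift-augmentation step are illustrative assumptions, not the inventors' method.

    # Illustrative sketch only: a small regression network maps photosensor
    # intensities to 2-D gaze coordinates, trained on per-user calibration
    # data augmented with simulated headset shifts.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n_sensors = 8           # hypothetical number of photosensors around the eye
    n_calibration = 2000    # samples collected while the user fixates known targets

    # Synthetic stand-in for calibration data: gaze targets shown on screen and
    # the sensor readings they would produce (here, a random smooth mapping).
    gaze_targets = rng.uniform(-1.0, 1.0, size=(n_calibration, 2))   # (x, y), normalized
    mixing = rng.normal(size=(2, n_sensors))
    sensor_readings = np.tanh(gaze_targets @ mixing) + rng.normal(0.0, 0.01, (n_calibration, n_sensors))

    # Augment with small random offsets to mimic the headset shifting on the head,
    # so the learned mapping tolerates such shifts.
    shifted = sensor_readings + rng.normal(0.0, 0.05, size=sensor_readings.shape)
    X = np.vstack([sensor_readings, shifted])
    y = np.vstack([gaze_targets, gaze_targets])

    # A small multilayer perceptron is cheap enough to evaluate at kilohertz rates.
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
    model.fit(X, y)

    # Run time: each new batch of sensor readings is mapped to a gaze estimate.
    new_readings = np.tanh(np.array([[0.2, -0.4]]) @ mixing)
    print("estimated gaze (x, y):", model.predict(new_readings))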

 

Key Benefits

  • Rapid eye tracking: Foveated rendering, which reduces computational load while increasing VR quality, requires eye tracking much faster than existing VOG technology provides; this technique samples at up to 1000 Hz (see the sketch after this list).
  • Robust device: Compared to existing PS-OG technology, this PS-OG technique is cheaper and has no moving parts, and the novel algorithm makes it robust to equipment movement.
  • High accuracy: Accuracy is comparable to or better than that of existing technology and is maintained even when the user is highly mobile.
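
As a rough illustration of the foveated rendering point above (not part of the patented technology), the sketch below scales render resolution per screen tile by its distance from the latest gaze estimate; the tile geometry, eccentricity thresholds, and scale factors are hypothetical.

    # Illustrative only: choose a render-resolution scale for a screen tile based
    # on its distance from the current gaze estimate. A high-rate (e.g., 1000 Hz)
    # gaze stream keeps the full-resolution region centered on where the user looks.
    import math

    def foveated_resolution(tile_center, gaze_xy, full_res=1.0):
        """Return a resolution scale factor for a tile (thresholds are hypothetical)."""
        eccentricity = math.dist(tile_center, gaze_xy)   # normalized screen units
        if eccentricity < 0.1:       # foveal region: full resolution
            return full_res
        if eccentricity < 0.3:       # near periphery: half resolution
            return full_res * 0.5
        return full_res * 0.25       # far periphery: quarter resolution

    print(foveated_resolution(tile_center=(0.05, 0.0), gaze_xy=(0.0, 0.0)))  # 1.0
    print(foveated_resolution(tile_center=(0.5, 0.5), gaze_xy=(0.0, 0.0)))   # 0.25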

 

Applications

  • Virtual Reality Headsets

 

Patent Status: 

 

Patent Pending

 

Licensing Rights Available

 

Full Rights Available

 

Inventors:

 

Oleg Komogortsev, Raimondas Zemblys

 

Tech ID:

 

TEC2018-0154

 

Patent Information:

For Information, Contact:

Raymond Devito
Technology Manager
Michigan State University
devitora@msu.edu