2021 article

Multiscale Sensor Fusion for Display-Centered Head Tracking

2021 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS (VRW 2021), pp. 522–523.

By: T. Wu & B. Watson

author keywords: Computing methodologies → Computer graphics → Graphics systems and interfaces → Virtual reality; Computing methodologies → Artificial intelligence → Computer vision problems → Tracking
TL;DR: This paper built a combination of two widely available and low-cost trackers, a Tobii Eye Tracker and a Kinect, which is more effective than the Kinect at short range and than the Tobii at longer range. (via Semantic Scholar)
Source: Web Of Science
Added: August 23, 2021

Emerging display usage scenarios require head tracking at both short (< 1 m) and modest (< 3 m) ranges. Yet it is difficult to find low-cost, unobtrusive tracking solutions that remain accurate across this range. By combining multiple head tracking solutions, we can mitigate the weaknesses of one solution with the strengths of another and improve head tracking overall. We built such a combination of two widely available and low-cost trackers, a Tobii Eye Tracker and a Kinect. The resulting system is more effective than the Kinect at short range and than the Tobii at longer range. In this paper, we discuss how we accomplish this sensor fusion and compare our combined system to an existing mechanical tracker to evaluate its accuracy across the combined range.
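The abstract does not specify the fusion rule itself. One minimal sketch of the general idea (weighting each tracker's estimate by how well it performs at the current range) is a depth-dependent crossfade between the two position estimates. All names, thresholds, and the linear weighting below are illustrative assumptions, not the authors' published method:

```python
def fuse_head_position(tobii_pos, kinect_pos, near=1.0, far=3.0):
    """Blend two 3-D head-position estimates (metres), hypothetically.

    Assumption (not from the paper): trust the Tobii fully below `near`,
    the Kinect fully beyond `far`, and crossfade linearly in between.
    Positions are (x, y, z) with z the distance from the display.
    """
    depth = 0.5 * (tobii_pos[2] + kinect_pos[2])  # mean z as the range estimate
    w = min(max((depth - near) / (far - near), 0.0), 1.0)  # Kinect weight in [0, 1]
    return tuple((1.0 - w) * t + w * k for t, k in zip(tobii_pos, kinect_pos))
```

At 0.5 m this returns the Tobii estimate unchanged; at 3 m it returns the Kinect's; at 2 m it averages the two. A real system would also need to calibrate both sensors into a common display-centered coordinate frame before blending.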