Wu, Chi-Jui and Houben, Steven and Marquardt, Nicolai (2017) EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing. In: CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, New York, pp. 3929-3942. ISBN 9781450346559
Abstract
Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking collaborating people, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system using a single top-view depth-sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette-extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools that facilitate integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.
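To illustrate the kind of classification step the abstract describes, the sketch below trains a gradient tree boosting classifier on placeholder feature vectors standing in for top-view silhouette-extremity features. This is not the EagleSense pipeline or its actual feature set; the feature dimensions, activity labels, and scikit-learn classifier are assumptions for illustration only.

```python
# Hedged sketch: gradient tree boosting on hypothetical extremity features.
# Placeholder data only; not the authors' features, labels, or code.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per tracked frame, columns such as
# extremity distances and angles relative to the body centroid.
X = rng.normal(size=(500, 16))
# Hypothetical activity labels, e.g. 0=standing, 1=phone, 2=tablet, 3=reading.
y = rng.integers(0, 4, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In practice, the per-frame feature vectors would be computed from the depth camera's silhouette extremities rather than sampled at random, and the trained model would be queried frame by frame to label each tracked person's current activity.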