Several generations of inexpensive depth cameras have opened the possibility for new kinds of interaction on everyday surfaces.
A number of research systems have demonstrated that depth cameras combined with projectors for output can turn nearly any reasonably flat surface into a touch-sensitive display.
However, even with the latest generation of depth cameras, it has been difficult to obtain sufficient sensing fidelity across a table-sized surface to get much beyond a proof-of-concept demonstration.
In this research, we present DIRECT, a novel touch-tracking algorithm that merges depth and infrared imagery captured by a commodity sensor.
This yields significantly better touch tracking than depth data alone, and better than any prior system.
Further, extending prior work, DIRECT supports arbitrary user orientation and requires no prior calibration or background capture.
We describe the implementation of our system and quantify its accuracy through a comparison study of previously published depth-based touch-tracking algorithms.
Results show that our technique boosts touch detection accuracy by 15% and reduces positional error by 55% compared to the next best-performing technique.
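To illustrate the core idea of fusing depth and infrared imagery, the sketch below shows a deliberately simplified hybrid detector: depth proposes candidate touch pixels near the surface, and the co-registered IR image rejects depth-only noise. Unlike DIRECT, this toy version assumes a precaptured background depth map; the function name, thresholds, and synthetic data are all illustrative assumptions, not the published algorithm.

```python
import numpy as np

def detect_touches(depth_mm, ir, background_mm,
                   touch_band=(3.0, 10.0), ir_min=128):
    """Return a boolean mask of candidate touch pixels.

    depth_mm      -- per-pixel depth of the current frame (mm)
    ir            -- co-registered infrared intensity image (0-255)
    background_mm -- depth of the empty surface (mm); a simplifying
                     assumption DIRECT itself does not require
    touch_band    -- height range above the surface treated as a touch
    ir_min        -- minimum IR brightness for a finger pixel
    """
    height = background_mm - depth_mm                 # height above surface
    depth_mask = (height >= touch_band[0]) & (height <= touch_band[1])
    ir_mask = ir >= ir_min                            # fingers reflect IR strongly
    return depth_mask & ir_mask                       # require both channels to agree

# Synthetic 8x8 frame: one "finger" (near surface AND bright in IR)
# and one depth-only artifact (near surface but dark in IR).
bg = np.full((8, 8), 1000.0)
depth = bg.copy()
depth[2:4, 2:4] = 995.0   # finger: 5 mm above the surface
depth[6:8, 6:8] = 995.0   # depth noise: same height, no IR response
ir = np.zeros((8, 8), dtype=np.uint8)
ir[2:4, 2:4] = 200        # finger is bright under IR illumination

mask = detect_touches(depth, ir, bg)
print(int(mask.sum()))    # → 4: only the 2x2 finger patch survives
```

The depth-only artifact passes the height test but fails the IR test, which is the intuition behind combining the two channels: each sensor vetoes the other's characteristic failure mode.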
Xiao, R., Hudson, S. E., and Harrison, C. 2016. DIRECT: Making Touch Tracking on Ordinary Surfaces Practical with Hybrid Depth-Infrared Sensing. In Proceedings of the 11th ACM International Conference on Interactive Surfaces and Spaces (ISS '16), Niagara Falls, Canada, November 6-9, 2016. ACM, New York, NY.