A Markerless Tracking Method based on Sensor Fusion and Iterative Closest Point
This paper presents a hybrid tracking solution for real-time volume data reconstruction and markerless tracking.
The system is designed to operate in an unprepared environment by continuously detecting the user's position and orientation and referencing them to an initial reference point in the world coordinate system, without the constant support of markers. The tracking and reconstruction approach is novel and based on data from a suitably filtered MEMS-based inertial tracker. Data from the inertial tracker are then used as the initial estimate for the Iterative Closest Point (ICP) algorithm applied to the point clouds acquired with the portable time-of-flight 3D scanner. The ICP result corrects the inertial tracker data, which are then used both to precisely register the volume data and to track the head position and orientation. The method runs iteratively in real time. Virtual models can be placed interactively by the user with reference to a selected position in the reference coordinate system and are retrieved when the object re-enters the user's view frustum. This method allows navigation in unprepared indoor and outdoor environments, with simultaneous building and saving of the 3D data, as well as navigation in an already known or previously saved environment.
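As an illustration of the core loop described in the abstract (not the authors' implementation), the sketch below shows how an inertial pose prior can seed a point-to-point ICP refinement against the latest time-of-flight point cloud, with the refined transform taken as the corrected pose. The function names, the plain NumPy/SciPy ICP, and the convergence parameters are assumptions made for this example.

```python
# Illustrative sketch only: inertial pose prior refined by point-to-point ICP.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp_with_prior(scan, model, R0, t0, iters=30, tol=1e-6):
    """Refine the inertial pose prior (R0, t0) by aligning a new scan to the model cloud."""
    R, t = np.asarray(R0, float), np.asarray(t0, float)
    tree = cKDTree(model)
    prev_err = np.inf
    for _ in range(iters):
        moved = scan @ R.T + t             # apply current pose estimate
        dists, idx = tree.query(moved)     # closest model points
        R_step, t_step = best_fit_transform(moved, model[idx])
        R, t = R_step @ R, R_step @ t + t_step   # compose incremental correction
        err = dists.mean()
        if abs(prev_err - err) < tol:      # stop when alignment no longer improves
            break
        prev_err = err
    return R, t                            # corrected pose for registration and head tracking
```

In such a scheme the inertial prior keeps ICP near the correct basin of convergence despite drift, and the ICP output can in turn be fed back to correct the inertial estimate before the next scan is integrated.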
DONATO Giuseppe;
SEQUEIRA Vitor;
BOSTROEM Gunnar;
SADKA Abdul;
2009-01-16
Czech Technical University in Prague
JRC42222
https://publications.jrc.ec.europa.eu/repository/handle/JRC42222