|Title:||A Markerless Tracking Method based on Sensor Fusion and Iterative Closest Point|
|Authors:||Donato, Giuseppe; Sequeira, Vitor; Bostroem, Gunnar; Sadka, Abdul|
|Citation:||Proceedings of the IEEE SMC International Conference on Distributed Human-Machine Systems, ISBN 978-80-01-04028-7, pp. 436-443|
|Publisher:||Czech Technical University in Prague|
|Type:||Articles in periodicals and books|
|Abstract:||This paper presents a hybrid tracking solution for real-time volume data reconstruction and markerless tracking. The system is designed to operate in an unprepared environment, continuously detecting and referencing the user's position and orientation to an initial reference point in the world coordinate system without the constant support of markers. The tracking and reconstruction approach is novel and based on exploiting data from a MEMS-based, adequately filtered inertial tracker. The inertial tracker data is used as the initial estimate for the Iterative Closest Point (ICP) algorithm, which is applied to the point clouds captured with a portable time-of-flight 3D scanner. The result of the ICP algorithm corrects the inertial tracker data, which is used both for precisely registering the volume data and for tracking the head position and orientation. The method runs iteratively in real time. Virtual models can be placed interactively by the user at a selected position in the reference coordinate system and are retrieved when the object re-enters the user's view frustum. The method allows navigation in unprepared indoor and outdoor environments, simultaneously building and saving the 3D data, as well as navigation in an already known or saved environment.|
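The core loop described in the abstract — an inertial pose estimate seeding ICP, whose result then corrects that pose — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the point-to-point ICP with an SVD (Kabsch) alignment step and brute-force nearest neighbours is a standard textbook formulation, and the function names, cloud sizes, and the simulated "inertial prior" are hypothetical.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Rigid transform (R, t) aligning paired points src -> dst via SVD (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(source, target, R0, t0, iters=30):
    """Point-to-point ICP seeded with an initial pose (R0, t0),
    standing in for the filtered inertial-tracker estimate."""
    R, t = R0, t0
    for _ in range(iters):
        moved = source @ R.T + t
        # brute-force nearest neighbours; fine for small illustrative clouds
        d = np.linalg.norm(moved[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        R_step, t_step = best_fit_transform(moved, matched)
        # compose the incremental correction with the current pose estimate
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```

In the paper's setting, the corrected pose returned by ICP would feed back into both the volume registration and the head-tracking state; the inertial prior matters because ICP only converges from a sufficiently close initial guess.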
|JRC Directorate:||Space, Security and Migration|