Single Object Tracking methods are not yet robust enough: they may lose the target due to occlusions or changes in the target's appearance, and it is difficult to detect automatically when they fail. To deal with these problems, we design a novel method that improves object tracking by fusing complementary types of trackers, so that each compensates for the others' weaknesses, using an Extended Kalman Filter to combine their outputs probabilistically. Environment perception is performed with a 3D LiDAR sensor, so the target can be tracked both in the point cloud and in the front-view image constructed from it. We apply our tracker-fusion method on a mobile robot to follow pedestrians while avoiding dynamic obstacles in the environment. We show that our method allows the robot to follow the target accurately during long experimental sessions in which the individual trackers fail, demonstrating the robustness of the fusion strategy.
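
The core of the approach is the probabilistic combination of the trackers' outputs in a filter. The following is a minimal, illustrative sketch of that idea, not the paper's implementation: it assumes a linear constant-velocity model and two position-only trackers fused by sequential Kalman updates, whereas the paper uses an Extended Kalman Filter with its own state, models, and noise parameters. All variable names and numeric values below are assumptions.

# Minimal sketch (not the authors' implementation) of fusing two tracker
# outputs with a Kalman filter. A constant-velocity model in the 2D ground
# plane is assumed; an Extended Kalman Filter handles nonlinear models the
# same way via linearization. Dimensions and noise values are illustrative.
import numpy as np

dt = 0.1                                   # assumed update period [s]
F = np.array([[1, 0, dt, 0],               # state transition for [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # both trackers measure position only
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-2                       # process noise (assumption)
R_cloud = np.eye(2) * 0.05                 # point-cloud tracker noise (assumption)
R_image = np.eye(2) * 0.20                 # front-view image tracker noise (assumption)

x = np.zeros(4)                            # initial state estimate
P = np.eye(4)                              # initial covariance


def predict():
    """Propagate the state and covariance one time step forward."""
    global x, P
    x = F @ x
    P = F @ P @ F.T + Q


def update(z, R):
    """Standard Kalman measurement update with measurement z and noise R."""
    global x, P
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P


# Each cycle: predict, then fuse whichever trackers produced a valid output.
predict()
update(np.array([1.0, 2.0]), R_cloud)      # measurement from point-cloud tracker
update(np.array([1.1, 1.9]), R_image)      # measurement from image tracker
print(x[:2])                               # fused target position estimate

Sequential updates weight each tracker by its measurement covariance, so a noisy or temporarily failing tracker contributes less to the fused estimate.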
@conference{olivas2023robust,
author={Alejandro Olivas and Miguel Muñoz{-}Bañón and Edison Velasco and Fernando Torres},
title={Robust Single Object Tracking and Following by Fusion Strategy},
booktitle={Proceedings of the 20th International Conference on Informatics in Control, Automation and Robotics - Volume 1: ICINCO},
year={2023},
pages={624--631},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012178900003543},
isbn={978-989-758-670-5},
issn={2184-2809},
}