Omnidirectional Localization in vSLAM with Uncertainty Propagation and Bayesian Regression
This article presents a visual localization technique based solely on omnidirectional images, within the framework of mobile robotics. The proposal exploits the epipolar constraint, adapted to the omnidirectional reference, to detect matching points, which ultimately determine a motion transformation for localizing the robot. The principal contributions lie in the propagation of the current uncertainty to the matching stage. In addition, a Bayesian regression technique is implemented to reinforce robustness. As a result, we provide a reliable adaptive matching, which proves stable and consistent against the non-linear and dynamic effects that affect the image frame, and consequently the final application. In particular, the search area for matching points is greatly reduced, which speeds up the search and avoids false correspondences. Real-data experiments confirm the benefit of these contributions and test the suitability of the localization when it is embedded in a vSLAM application.
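To make the central idea concrete, the sketch below illustrates how an epipolar constraint can restrict the correspondence search. For an omnidirectional camera, image points are commonly represented as unit bearing vectors, and two bearings `b1`, `b2` observing the same landmark satisfy `b2ᵀ E b1 = 0`, where `E = [t]× R` is the essential matrix of the relative motion. The function names and the tolerance parameter are illustrative, not part of the article's method; in the paper the admissible region is further adapted by the propagated uncertainty rather than a fixed threshold.

```python
import numpy as np

def epipolar_residual(E, b1, b2):
    """Algebraic epipolar error b2^T E b1 for unit bearing vectors
    (omnidirectional camera model). Zero for a true correspondence."""
    return float(b2 @ E @ b1)

def filter_candidates(E, b1, candidates, tol=1e-2):
    """Keep only candidate bearings lying close to the epipolar curve
    induced by b1, shrinking the correspondence search region."""
    return [b for b in candidates if abs(epipolar_residual(E, b1, b)) < tol]
```

A true match (a bearing consistent with the motion encoded in `E`) yields a residual near zero and is retained, while unrelated bearings are rejected, which is the mechanism that prunes false correspondences.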