Generation of Data Sets Simulating Different Kinds of Cameras in Virtual Environments
Y. Berenguer, L. Paya, O. Reinoso, A. Peidro, L.M. Jimenez
13th International Conference on Informatics in Control, Automation and Robotics ICINCO2016  (Lisbon, Portugal, 29-31 July 2016)
Ed. Insticc  ISBN:978-989-758-198-4  - Volume 2, pp. 352-359


In this paper, a platform to create different kinds of data sets from virtual environments is presented. These data sets contain information about the visual appearance of the environment and the distance from a set of reference positions to all the objects. Robot localization and mapping using images are two active fields of research, and new algorithms are continuously proposed. These algorithms have to be tested with several sets of images to validate them. This task can be carried out using real images; however, when a change in the parameters of the vision system is needed to optimize the algorithms, the system must be replaced and new data sets must be captured. This entails a high cost and slows down the first stages of development. The objective of this work is to develop a versatile tool that permits generating data sets to test mapping and localization algorithms with mobile robots efficiently. Another advantage of this platform is that the images can be generated from any position in the environment and with any rotation. Besides, the generated images are noise-free; this is an advantage since it allows carrying out a preliminary test of the algorithms under ideal conditions. The virtual environment can be created easily and modified depending on the desired characteristics. Finally, the platform permits carrying out other advanced tasks using the images and the virtual environment.
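The abstract describes rendering noise-free appearance and distance information from an arbitrary pose in a virtual environment. As an illustrative sketch only (not the paper's actual platform), the idea can be reduced to ray casting against a synthetic scene: the scene, the pose, and the panoramic sampling below are all hypothetical choices made for this example.

```python
import math

# Hypothetical virtual environment: a list of spheres (center, radius).
SCENE = [
    ((2.0, 0.0, 0.0), 0.5),
    ((0.0, 3.0, 0.5), 1.0),
]

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-direction ray to the sphere, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearest intersection
    return t if t > 1e-9 else None

def panoramic_depth(position, yaw, width=64, height=16, max_depth=100.0):
    """Render a noise-free panoramic depth map from an arbitrary pose.

    Every pixel maps to an azimuth/elevation ray; the depth stored is the
    exact distance to the nearest object, with no sensor noise added.
    """
    image = []
    for row in range(height):
        pitch = math.pi * (0.5 - (row + 0.5) / height)  # +pi/2 .. -pi/2
        line = []
        for col in range(width):
            theta = yaw + 2.0 * math.pi * col / width   # full 360-degree sweep
            d = (math.cos(pitch) * math.cos(theta),
                 math.cos(pitch) * math.sin(theta),
                 math.sin(pitch))
            depth = max_depth
            for center, radius in SCENE:
                t = ray_sphere(position, d, center, radius)
                if t is not None and t < depth:
                    depth = t
            line.append(depth)
        image.append(line)
    return image

# Because the scene is analytic, views can be generated from any position
# and rotation, e.g. panoramic_depth((0.0, 0.0, 0.0), yaw=0.0).
```

Changing `position` and `yaw` regenerates the data set from a new pose without recapturing anything, which is the workflow advantage the abstract attributes to the platform.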