
Documat


Abstract of Underwater navigation and mapping with an omnidirectional optical sensor

Josep Bosch Alay

  • Omnidirectional vision has received increasing interest from the computer vision community during the last decade. A large number of camera models have reached the market to meet the growing demand for panoramic imagery. However, the use of omnidirectional cameras underwater is still very limited. In this thesis we propose a number of methods to create a reference resource for designing, calibrating and using underwater omnidirectional multi-camera systems.

    The first problem we address is the design and calibration of omnidirectional cameras for the underwater domain. Among the different approaches to capturing omnidirectional imagery, we chose multi-camera systems, due to the higher resolution and quality of the final images they produce. To assist the design and ensure proper view coverage, a field-of-view (FOV) simulator has been developed which takes into account the individual FOVs of the cameras, their relative positions and orientations, and the geometry and relative pose of the waterproof housing. The latter is especially relevant due to the strong image distortions caused by the refraction of the optical rays as they travel through the different media. Once the system is built, a very accurate calibration is required for any metrology or computer vision application. Therefore, a full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters of the cameras and the relative pose of the waterproof housing. This method copes with wide-angle lenses and non-overlapping cameras simultaneously, and is applicable to both in-air and underwater Omnidirectional Multi-camera Systems (OMS).
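The housing-induced refraction that such a FOV simulator must account for is governed by Snell's law at each media interface. As an illustration only (not the thesis's simulator), the sketch below traces a ray through an assumed flat housing port, using typical refractive indices for air, acrylic and water:

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Vector form of Snell's law. `incident` is a unit ray direction,
    `normal` a unit surface normal pointing toward the incident medium.
    Returns the refracted unit direction, or None on total internal
    reflection."""
    eta = n1 / n2
    cos_i = -np.dot(incident, normal)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * incident + (eta * cos_i - cos_t) * normal

# Ray leaving the camera at 30 deg from the port normal, crossing
# air (n ~ 1.00) -> acrylic (n ~ 1.49) -> water (n ~ 1.33):
ray = np.array([0.0, np.sin(np.radians(30)), np.cos(np.radians(30))])
n = np.array([0.0, 0.0, -1.0])          # port normal, facing the camera
in_glass = refract(ray, n, 1.00, 1.49)  # bends toward the normal
in_water = refract(in_glass, n, 1.49, 1.33)
```

Because the two interfaces of a flat port are parallel, the exit angle in water depends only on the air and water indices; the glass thickness merely offsets the ray, which is one reason flat ports effectively narrow each camera's usable FOV.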

    Next, the topic of stitching strategies to generate omnidirectional panoramas from the individual images is studied in depth. Stitching strategies have the complex objective of joining the images so that the viewer has the feeling the panorama was captured from a single location. Conventional approaches either assume that the world is a simple sphere around the camera or use feature-based stitching techniques to align the individual images. However, both lead to artifacts and misalignments in the final panoramas due to parallax effects. This thesis presents a set of new stitching strategies, for both online and offline applications, that process the images according to the available information about the multi-camera system and the environment.
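The "simple sphere" assumption of conventional stitching amounts to mapping every viewing ray to longitude/latitude coordinates of an equirectangular panorama, ignoring where each physical camera actually sits. A minimal sketch of that conventional mapping (the thesis's strategies refine this using rig and scene information; the panorama size is arbitrary):

```python
import numpy as np

def ray_to_equirect(ray, width, height):
    """Map a unit viewing ray (x, y, z) to (u, v) pixel coordinates in an
    equirectangular panorama: longitude along u, latitude along v."""
    x, y, z = ray
    lon = np.arctan2(x, z)               # [-pi, pi], 0 at the +z axis
    lat = np.arcsin(np.clip(y, -1, 1))   # [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * width
    v = (lat / np.pi + 0.5) * height
    return u, v

# A ray straight down the +z axis lands at the panorama centre:
u, v = ray_to_equirect((0.0, 0.0, 1.0), 4096, 2048)
```

The parallax problem follows directly: a camera offset by a baseline b from the rig centre sees a point at distance d along a ray that differs by roughly b/d radians from the ray the rig centre would see, so nearby objects end up misaligned in the overlap regions.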

    Finally, we focus on potential underwater applications. We first explore the promising uses of omnidirectional cameras to create immersive virtual experiences. Then, we demonstrate the capabilities of omnidirectional cameras as complementary sensors for the navigation of underwater robots. Specifically, we present a new tracking system for autonomous underwater vehicles (AUVs) navigating in close formation. The proposed system, which makes use of active light markers, estimates the pose of a target vehicle at short range with high accuracy and execution speed.
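The abstract does not detail the pose estimator itself. One standard building block for recovering a target's pose from a known marker layout is rigid (Kabsch) alignment of the body-frame marker positions to their observed positions; the sketch below uses a hypothetical marker layout and noise-free observations purely for illustration:

```python
import numpy as np

def rigid_align(markers_body, markers_world):
    """Kabsch algorithm: find rotation R and translation t minimising
    sum ||R @ b_i + t - w_i||^2 over corresponding Nx3 point sets."""
    cb = markers_body.mean(axis=0)
    cw = markers_world.mean(axis=0)
    H = (markers_body - cb).T @ (markers_world - cw)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cb
    return R, t

# Hypothetical light-marker layout on the target AUV (body frame, metres):
body = np.array([[0.0, 0.0, 0.0], [0.6, 0.0, 0.0],
                 [0.0, 0.4, 0.0], [0.0, 0.0, 0.3]])
a = np.radians(40)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([1.0, -2.0, 0.5])
world = body @ R_true.T + t_true     # simulated observed marker positions
R_est, t_est = rigid_align(body, world)
```

At least three non-collinear markers are needed for a unique pose; a real tracker would additionally handle detection noise and marker identification, which this sketch omits.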

    To validate all the presented algorithms, two custom omnidirectional cameras were built, and several experiments with divers and underwater robots were carried out to collect the necessary data.


