When an autonomous vehicle has to carry out a mission, one of the most important aspects is its localization within the mission area. This task is already a challenge in indoor environments, where walls, doors, corners and other man-made objects provide clear features that can be associated over time to build a map of the environment. Localization becomes even more difficult when the mission takes place in an unstructured location, and especially in underwater environments.
Over the past few years, underwater vehicles have greatly improved as a tool for undersea exploration and navigation. In particular, autonomous navigation, localization, and mapping through optical imaging have become topics of interest for researchers in both underwater robotics and marine science. Underwater imagery can be used to construct image composites (photomosaics) with many different applications, such as underwater surveying and navigation. For surveying operations with a low-cost robot limited to a down-looking camera and a sonar altimeter, it is common practice to ensure sufficient overlap between time-consecutive images, since these images are the only data source available for navigation.
When the robot revisits a previously surveyed area, it is essential to detect and match the non-time-consecutive images in order to close a loop and, thus, improve the trajectory estimate. When no additional navigation information is available, most existing mosaicing algorithms attempt to match all image pairs in order to detect the non-time-consecutive overlapping images.
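The exhaustive pairwise-matching strategy described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' method: it stands in for real image matching by representing each frame as a toy set of feature descriptors and scoring overlap with Jaccard similarity, where an actual system would match local image features (e.g. SIFT/ORB) and verify geometry. All names (`detect_loop_closures`, `min_gap`, `threshold`) and the data are hypothetical.

```python
from itertools import combinations

def jaccard(a, b):
    """Toy overlap score between two feature-descriptor sets."""
    return len(a & b) / len(a | b)

def detect_loop_closures(frames, min_gap=2, threshold=0.5):
    """Exhaustively compare all non-time-consecutive frame pairs.

    frames: list of sets of (toy) feature descriptors, one per image.
    Returns (i, j) index pairs whose overlap suggests a revisited area,
    i.e. loop-closure candidates for trajectory correction.
    """
    candidates = []
    for i, j in combinations(range(len(frames)), 2):
        if j - i < min_gap:   # skip time-consecutive pairs: they always overlap
            continue
        if jaccard(frames[i], frames[j]) >= threshold:
            candidates.append((i, j))
    return candidates

# Toy survey: frame 4 revisits the area first seen in frame 0.
frames = [
    {"f1", "f2", "f3"},
    {"f3", "f4"},
    {"f5", "f6"},
    {"f7", "f8"},
    {"f1", "f2", "f9"},
]
print(detect_loop_closures(frames))  # [(0, 4)]
```

The quadratic cost of comparing all pairs is exactly why such exhaustive matching is only practical for short surveys; this motivates smarter candidate selection when trajectories grow.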
[Figure: Autonomous Underwater Vehicle (Girona-500)]