The ability to navigate independently in the surrounding space (to build a map of it and to determine one's own position on that map) is one of the key capabilities without which autonomous operation of any mobile robot, and in particular of an unmanned aerial vehicle, is impossible. It is not surprising that roughly 60% of the topics discussed at any major robotics conference relate in one way or another to this problem, for which no exact and complete solution has yet been found.
In robotics, the tasks of building a map (mapping) and determining one's position on it (localization) are combined into a single problem of simultaneous localization and mapping (SLAM). The SLAM problem can be solved in many different ways, depending on what information about the environment is available to the agent, which in turn depends on the sensors the agent is equipped with.
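The coupling between the two sub-tasks can be illustrated with a deliberately simplified sketch (a hypothetical 1-D example for intuition only, not the method used in this work): the agent fuses noisy odometry with range measurements to landmarks, so that each known landmark corrects the pose estimate (localization) while the corrected pose in turn refines the landmark estimates (mapping). The blending weight `alpha` and the measurement model are assumptions of this toy example.

```python
def slam_step(pose, landmarks, odom, observations, alpha=0.5):
    """One step of a toy 1-D SLAM filter.

    pose         -- current position estimate (float)
    landmarks    -- dict: landmark id -> estimated position (updated in place)
    odom         -- reported displacement since the last step (may be biased)
    observations -- dict: landmark id -> signed range (landmark minus robot)
    alpha        -- blending weight between prediction and measurement
    """
    # Prediction: dead reckoning from odometry alone.
    pose = pose + odom
    for lid, rng in observations.items():
        if lid not in landmarks:
            # Mapping: initialise a new landmark from the current pose.
            landmarks[lid] = pose + rng
        else:
            # Localization: a known landmark corrects the pose estimate...
            pose = (1 - alpha) * pose + alpha * (landmarks[lid] - rng)
            # ...and the corrected pose refines the landmark (mapping).
            landmarks[lid] = (1 - alpha) * landmarks[lid] + alpha * (pose + rng)
    return pose, landmarks


# Demo: a landmark sits at 10.0; the robot moves +1.0 per step, but its
# odometry over-reports the motion as +1.2 (20% bias).
pose_est, lmap = 0.0, {}
pose_est, lmap = slam_step(pose_est, lmap, 0.0, {"L": 10.0})  # initial sighting
for t in range(1, 6):
    pose_est, lmap = slam_step(pose_est, lmap, 1.2, {"L": 10.0 - t})
# After 5 steps the true pose is 5.0; pure dead reckoning would report 6.0,
# while the fused estimate stays noticeably closer to the truth.
```

Even this crude scheme shows why the two sub-tasks cannot be solved in isolation: the quality of the map bounds the quality of localization, and vice versa. Practical solutions replace the fixed `alpha` with a proper filter (e.g. an EKF) or with graph optimization.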
In our research, we focus on small (up to 50 cm in diameter) multi-rotor aircraft, which, owing to design constraints, low payload capacity, and a limited power budget, carry only compact video cameras. We therefore solve the SLAM problem by processing the video stream (the so-called vSLAM, visual SLAM).
Within this line of research, efficient methods for mapping and localization from video data are being developed.