HPointLoc: Point-Based Indoor Place Recognition Using Synthetic RGB-D Images


Panov A., Staroverov A.


We present a novel dataset named HPointLoc, specially designed for exploring the capabilities of visual place recognition in indoor environments and loop detection in simultaneous localization and mapping. The loop detection sub-task is especially relevant when a robot with an on-board RGB-D camera drives past the same place ("Point") at different angles. The dataset is based on the popular Habitat simulator, which can generate photorealistic indoor scenes from both custom sensor data and open datasets such as Matterport3D. To study the main stages of solving the place recognition problem on the HPointLoc dataset, we propose a new modular approach named PNTR. It first performs image retrieval with the Patch-NetVLAD method, then extracts keypoints and matches them using R2D2, LoFTR, or SuperPoint with SuperGlue, and finally performs a camera pose optimization step with TEASER++. This combination of methods for the place recognition problem has not been studied in previous publications. The PNTR approach achieves the best quality metrics on the HPointLoc dataset and has high potential for real use in localization systems for unmanned vehicles. The proposed dataset and framework are publicly available: https://github.com/metra4ok/HPointLoc
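The three PNTR stages described above can be sketched as a toy pipeline. This is an illustrative stand-in only: every function and all data below are hypothetical simplifications, not the real Patch-NetVLAD, R2D2/LoFTR/SuperPoint+SuperGlue, or TEASER++ APIs.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two global image descriptors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve_candidates(query_desc, db_descs, k=2):
    # Stage 1 stand-in for Patch-NetVLAD: rank database images by
    # global-descriptor similarity and return the k best indices.
    order = sorted(range(len(db_descs)),
                   key=lambda i: cosine_similarity(query_desc, db_descs[i]),
                   reverse=True)
    return order[:k]

def estimate_translation(matches):
    # Stage 3 stand-in for TEASER++ pose optimization: given matched
    # 3D point pairs (query_point, db_point) -- which stage 2 would
    # produce via keypoint matching -- recover a pure translation as
    # the mean offset between the paired points.
    n = len(matches)
    return tuple(sum(q[i] - d[i] for q, d in matches) / n
                 for i in range(3))
```

A usage example under these toy assumptions: retrieval picks the database images whose descriptors point in the same direction as the query, and the "pose" step reduces to averaging per-match 3D offsets.

```python
db = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
query = [1.0, 0.05]
top = retrieve_candidates(query, db, k=2)       # -> [0, 2]

matches = [((1.0, 2.0, 3.0), (0.0, 2.0, 3.0)),
           ((2.0, 0.0, 1.0), (1.0, 0.0, 1.0))]
t = estimate_translation(matches)               # -> (1.0, 0.0, 0.0)
```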

External links

DOI: 10.1007/978-3-031-30111-7_40

Download early version (PDF) from arXiv.org: https://arxiv.org/pdf/2212.14649.pdf

Dataset and Framework at GitHub: https://github.com/metra4ok/HPointLoc

ResearchGate: https://www.researchgate.net/publication/366789684_HPointLoc_Point-based_Indoor_Place_Recognition_using_Synthetic_RGB-D_Images

Reference

Yudin, D., Solomentsev, Y., Musaev, R., Staroverov, A., Panov, A. I. (2023). HPointLoc: Point-Based Indoor Place Recognition Using Synthetic RGB-D Images. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13625, pp. 471–484. Springer, Cham.