Towards a Realistic Indoor World Reconstruction: Preliminary Results for an Object-Oriented 3D RGB-D Mapping
- Authors
- Jun, ChangHyun; Kang, Jaehyeon; Yeon, Suyong; Choi, Hyunga; Chung, Tae-Young; Doh, Nakju Lett
- Issue Date
- 2017
- Publisher
- TSI PRESS
- Keywords
- Real world reconstruction; 3-dimensional map; RGB-D; SLAM; SfM
- Citation
- INTELLIGENT AUTOMATION AND SOFT COMPUTING, v.23, no.2, pp.207 - 218
- Indexed
- SCIE; SCOPUS
- Journal Title
- INTELLIGENT AUTOMATION AND SOFT COMPUTING
- Volume
- 23
- Number
- 2
- Start Page
- 207
- End Page
- 218
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/86339
- DOI
- 10.1080/10798587.2016.1186890
- ISSN
- 1079-8587
- Abstract
- Real-world reconstruction, which generates cyberspace not from a computer-graphics tool but from the real world, has been a central issue in the two communities of robotics and computer vision under the respective names of Simultaneous Localization And Mapping (SLAM) and Structure from Motion (SfM). However, there have been few attempts to actively integrate SLAM and SfM for possible synergy. This paper shows that real-world reconstruction can be enabled through such an integration. As a result, a preliminary map has been generated that meets five subgoals: a realistic view (RGB), accurate geometry (depth), applicability to multi-floor indoor buildings, initial classification of a possible set of objects, and full automation. To this end, an engineering framework of Acquire-Build-Comprehend (ABC) is proposed, through which a sensor system acquires an RGB-Depth point cloud from the real world, builds a three-dimensional map, and comprehends this map to yield the possible set of objects. Its performance is demonstrated by building a map of three floors of an indoor building with a volume of 1,408 m³.
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- Executive Vice President for Research > Institute of Convergence Science > 1. Journal Articles