Robust Online 3D Reconstruction Combining a Depth Sensor and Sparse Feature Points
(2017) 2016 23rd International Conference on Pattern Recognition (ICPR 2016), pp. 3709-3714
- Abstract
- Online 3D reconstruction has been an active research area for a long time. Since the release of the Microsoft Kinect camera and the publication of KinectFusion [11], attention has been drawn to how to acquire dense models in real-time. In this paper we present a method for online 3D reconstruction that increases robustness for scenes with little structure and little texture. We show empirically that our proposed method also increases robustness when the distance between camera positions becomes larger than is commonly assumed. Quantitative and qualitative results suggest that this approach can handle situations where other well-known methods fail. This is important in, for example, robotics applications, where the camera position must be estimated and the 3D model created online in real-time.
Please use this URL to cite or link to this publication:
https://lup.lub.lu.se/record/052b8cdb-edcf-42bf-8870-f54dffc39249
- author
- Bylow, Erik; Olsson, Carl and Kahl, Fredrik
- publishing date
- 2017-04-24
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- published
- host publication
- Pattern Recognition (ICPR), 2016 23rd International Conference on
- pages
- 6 pages
- publisher
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- conference name
- 2016 23rd International Conference on Pattern Recognition (ICPR 2016)
- conference location
- Cancún, Mexico
- conference dates
- 2016-12-04 - 2016-12-08
- external identifiers
- scopus:85019134505
- ISBN
- 978-1-5090-4847-2
- DOI
- 10.1109/ICPR.2016.7900211
- language
- English
- LU publication?
- yes
- id
- 052b8cdb-edcf-42bf-8870-f54dffc39249
- date added to LUP
- 2017-03-17 12:53:32
- date last changed
- 2022-03-01 20:40:06
@inproceedings{052b8cdb-edcf-42bf-8870-f54dffc39249,
  abstract  = {{Online 3D reconstruction has been an active research area for a long time. Since the release of the Microsoft Kinect camera and the publication of KinectFusion [11], attention has been drawn to how to acquire dense models in real-time. In this paper we present a method for online 3D reconstruction that increases robustness for scenes with little structure and little texture. We show empirically that our proposed method also increases robustness when the distance between camera positions becomes larger than is commonly assumed. Quantitative and qualitative results suggest that this approach can handle situations where other well-known methods fail. This is important in, for example, robotics applications, where the camera position must be estimated and the 3D model created online in real-time.}},
  author    = {{Bylow, Erik and Olsson, Carl and Kahl, Fredrik}},
  booktitle = {{Pattern Recognition (ICPR), 2016 23rd International Conference on}},
  isbn      = {{978-1-5090-4847-2}},
  language  = {{eng}},
  month     = {{04}},
  pages     = {{3709--3714}},
  publisher = {{IEEE - Institute of Electrical and Electronics Engineers Inc.}},
  title     = {{Robust Online 3D Reconstruction Combining a Depth Sensor and Sparse Feature Points}},
  url       = {{http://dx.doi.org/10.1109/ICPR.2016.7900211}},
  doi       = {{10.1109/ICPR.2016.7900211}},
  year      = {{2017}},
}