
Lund University Publications


Continuous close-range 3D object pose estimation

Grossmann, Bjarne; Rovida, Francesco and Kruger, Volker (2019) 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019. In IEEE International Conference on Intelligent Robots and Systems, pp. 2861-2867
Abstract

In the context of future manufacturing lines, removing fixtures will be a fundamental step to increase the flexibility of autonomous systems in assembly and logistic operations. Vision-based 3D pose estimation is a necessity to accurately handle objects that might not be placed at fixed positions during robot task execution. Industrial tasks bring multiple challenges for the robust pose estimation of objects, such as difficult object properties, tight cycle times and constraints on camera views. In particular, when interacting with objects, we have to work with close-range partial views of objects, which pose a new challenge for typical view-based pose estimation methods. In this paper, we present a 3D pose estimation method based on a gradient-ascent particle filter that integrates new observations on the fly to improve the pose estimate. Thereby, we can apply this method online during task execution to save valuable cycle time. In contrast to other view-based pose estimation methods, we model potential views in full 6-dimensional space, which allows us to cope with close-range partial object views. We demonstrate the approach on a real assembly task, in which the algorithm usually converges to the correct pose within 10-15 iterations with an average error of less than 8 mm.
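The abstract describes a gradient-ascent particle filter that refines a 6D pose estimate as new close-range observations arrive. The sketch below is only a generic illustration of that class of filter, not the authors' implementation: particles over 6 pose parameters are diffused, locally refined by gradient ascent on an observation likelihood, reweighted, and resampled. The functions `observation_likelihood` and `pose_gradient` are hypothetical placeholders for a model-vs-view scoring function and its numerical gradient.

```python
import numpy as np

# Hypothetical sketch of a gradient-ascent particle filter over 6D poses
# (3 translation + 3 rotation parameters per particle). Not the paper's code.

def observation_likelihood(pose, observation):
    """Placeholder: score how well the object model under `pose` explains the current view."""
    raise NotImplementedError

def pose_gradient(pose, observation, eps=1e-3):
    """Finite-difference gradient of the likelihood w.r.t. the 6 pose parameters."""
    grad = np.zeros(6)
    base = observation_likelihood(pose, observation)
    for i in range(6):
        p = pose.copy()
        p[i] += eps
        grad[i] = (observation_likelihood(p, observation) - base) / eps
    return grad

def filter_step(particles, observation, step=0.01, noise=0.005):
    """One iteration: diffuse, refine by gradient ascent, reweight, resample."""
    n = len(particles)
    # 1. Diffuse particles to account for pose uncertainty between observations.
    particles = particles + np.random.normal(0.0, noise, particles.shape)
    # 2. Gradient-ascent refinement of each particle toward higher likelihood.
    for i in range(n):
        particles[i] += step * pose_gradient(particles[i], observation)
    # 3. Reweight particles by the new observation.
    weights = np.array([observation_likelihood(p, observation) for p in particles])
    weights = np.maximum(weights, 1e-12)
    weights /= weights.sum()
    # 4. Resample proportionally to the weights.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx]
```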

author
Grossmann, Bjarne; Rovida, Francesco and Kruger, Volker
organization
publishing date
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
series title
IEEE International Conference on Intelligent Robots and Systems
article number
8967580
pages
7 pages
publisher
IEEE - Institute of Electrical and Electronics Engineers Inc.
conference name
2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
conference location
Macau, China
conference dates
2019-11-03 - 2019-11-08
external identifiers
  • scopus:85081154408
ISSN
2153-0866
2153-0858
ISBN
9781728140049
DOI
10.1109/IROS40897.2019.8967580
language
English
LU publication?
yes
additional info
Funding Information: The research leading to these results has received funding from the European Commission’s Horizon 2020 Programme under grant agreement no. 723658 (SCALABLE). Publisher Copyright: © 2019 IEEE. Copyright: Copyright 2020 Elsevier B.V., All rights reserved.
id
d57ce845-c5b6-4319-bb1d-3ebd47b75945
date added to LUP
2021-03-04 15:35:34
date last changed
2024-05-02 05:30:03
@inproceedings{d57ce845-c5b6-4319-bb1d-3ebd47b75945,
  abstract     = {{In the context of future manufacturing lines, removing fixtures will be a fundamental step to increase the flexibility of autonomous systems in assembly and logistic operations. Vision-based 3D pose estimation is a necessity to accurately handle objects that might not be placed at fixed positions during robot task execution. Industrial tasks bring multiple challenges for the robust pose estimation of objects, such as difficult object properties, tight cycle times and constraints on camera views. In particular, when interacting with objects, we have to work with close-range partial views of objects, which pose a new challenge for typical view-based pose estimation methods. In this paper, we present a 3D pose estimation method based on a gradient-ascent particle filter that integrates new observations on the fly to improve the pose estimate. Thereby, we can apply this method online during task execution to save valuable cycle time. In contrast to other view-based pose estimation methods, we model potential views in full 6-dimensional space, which allows us to cope with close-range partial object views. We demonstrate the approach on a real assembly task, in which the algorithm usually converges to the correct pose within 10-15 iterations with an average error of less than 8 mm.}},
  author       = {{Grossmann, Bjarne and Rovida, Francesco and Kruger, Volker}},
  booktitle    = {{2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019}},
  isbn         = {{9781728140049}},
  issn         = {{2153-0866}},
  language     = {{eng}},
  pages        = {{2861--2867}},
  publisher    = {{IEEE - Institute of Electrical and Electronics Engineers Inc.}},
  series       = {{IEEE International Conference on Intelligent Robots and Systems}},
  title        = {{Continuous close-range 3D object pose estimation}},
  url          = {{http://dx.doi.org/10.1109/IROS40897.2019.8967580}},
  doi          = {{10.1109/IROS40897.2019.8967580}},
  year         = {{2019}},
}