Lund University Publications

Imitation learning of non-linear point-to-point robot motions using Dirichlet processes

Krüger, Volker; Tikhanoff, Vadim; Natale, Lorenzo and Sandini, Giulio (2012) 2012 IEEE International Conference on Robotics and Automation, ICRA 2012, pp. 2029–2034
Abstract

In this paper we discuss the use of the infinite Gaussian mixture model and Dirichlet processes for learning robot movements from demonstrations. The starting point of this work is an earlier paper in which the authors learn a non-linear dynamic robot movement model from a small number of observations. The model in that work is learned using a classical finite Gaussian mixture model (FGMM) in which the Gaussian mixtures are appropriately constrained. The problem with this approach is that one needs to make a good guess for how many mixtures the FGMM should use. In this work, we generalize this approach to use an infinite Gaussian mixture model (IGMM), which does not have this limitation. Instead, the IGMM automatically finds the number of mixtures that are necessary to reflect the data complexity. For use in the context of a non-linear dynamic model, we develop a Constrained IGMM (CIGMM). We validate our algorithm on the same data that was used in [5], where the authors use motion capture devices to record the demonstrations. As further validation, we test our approach on novel data acquired on our iCub in a different demonstration scenario in which the robot is physically driven by the human demonstrator.
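The core IGMM idea described in the abstract can be sketched with a Dirichlet-process mixture. This is an illustrative assumption, not the paper's CIGMM implementation: it uses scikit-learn's `BayesianGaussianMixture` with a Dirichlet-process prior, where `n_components` is only an upper bound and the prior prunes components the data does not need, rather than fixing the mixture count in advance as with a classical FGMM.

```python
# Illustrative sketch only (not the authors' CIGMM): a Dirichlet-process
# Gaussian mixture infers the effective number of components from the data.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 2-D stand-in for demonstration data, drawn from three clusters.
data = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(100, 2)),
    rng.normal(loc=[1.0, 1.0], scale=0.1, size=(100, 2)),
    rng.normal(loc=[2.0, 0.0], scale=0.1, size=(100, 2)),
])

# n_components is an upper bound; the DP prior drives unused weights to ~0.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(data)

# Components with non-negligible weight are the ones the model kept.
effective = int(np.sum(dpgmm.weights_ > 0.01))
print(effective)
```

For well-separated data like this, the number of retained components typically matches the true cluster count, which is exactly the "good guess" the FGMM would otherwise require from the user.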

author
Krüger, Volker; Tikhanoff, Vadim; Natale, Lorenzo and Sandini, Giulio
publishing date
2012
type
Chapter in Book/Report/Conference proceeding
publication status
published
host publication
2012 IEEE International Conference on Robotics and Automation, ICRA 2012
article number
6224674
pages
6 pages
publisher
IEEE - Institute of Electrical and Electronics Engineers Inc.
conference name
2012 IEEE International Conference on Robotics and Automation, ICRA 2012
conference location
Saint Paul, MN, United States
conference dates
2012-05-14 - 2012-05-18
external identifiers
  • scopus:84864437277
ISBN
9781467314039
DOI
10.1109/ICRA.2012.6224674
language
English
LU publication?
no
id
ba610bc5-8e9d-4d3f-943f-bacced3225ee
date added to LUP
2019-06-28 09:21:22
date last changed
2022-02-23 07:06:39
@inproceedings{ba610bc5-8e9d-4d3f-943f-bacced3225ee,
  abstract     = {{<p>In this paper we discuss the use of the infinite Gaussian mixture model and Dirichlet processes for learning robot movements from demonstrations. Starting point of this work is an earlier paper where the authors learn a non-linear dynamic robot movement model from a small number of observations. The model in that work is learned using a classical finite Gaussian mixture model (FGMM) where the Gaussian mixtures are appropriately constrained. The problem with this approach is that one needs to make a good guess for how many mixtures the FGMM should use. In this work, we generalize this approach to use an infinite Gaussian mixture model (IGMM) which does not have this limitation. Instead, the IGMM automatically finds the number of mixtures that are necessary to reflect the data complexity. For use in the context of a non-linear dynamic model, we develop a Constrained IGMM (CIGMM). We validate our algorithm on the same data that was used in [5], where the authors use motion capture devices to record the demonstrations. As further validation we test our approach on novel data acquired on our iCub in a different demonstration scenario in which the robot is physically driven by the human demonstrator.</p>}},
  author       = {{Krüger, Volker and Tikhanoff, Vadim and Natale, Lorenzo and Sandini, Giulio}},
  booktitle    = {{2012 IEEE International Conference on Robotics and Automation, ICRA 2012}},
  isbn         = {{9781467314039}},
  language     = {{eng}},
  month        = {{01}},
  pages        = {{2029--2034}},
  publisher    = {{IEEE - Institute of Electrical and Electronics Engineers Inc.}},
  title        = {{Imitation learning of non-linear point-to-point robot motions using dirichlet processes}},
  url          = {{http://dx.doi.org/10.1109/ICRA.2012.6224674}},
  doi          = {{10.1109/ICRA.2012.6224674}},
  year         = {{2012}},
}