Lund University Publications

SARAFun, Smart Assembly Robot with Advanced FUNctionalities, H2020

Bekiroglu, Yasemin; Haschke, Robert; Karayiannidis, Yiannis; Mariolis, Ioannis; McIntyre, Joseph; Malec, Jacek and Remazeilles, Anthony (2017) In Impact 2017(5), p. 67-69
Abstract
While industrial robots are very successful in many areas of industrial manufacturing, assembly automation still suffers from complex, time-consuming programming and the need for dedicated hardware. ABB has developed YuMi, a collaborative, inherently safe assembly robot that is expected to reduce integration costs significantly by offering a standardized hardware setup and simple fitting of the robot into existing workplaces. Internal pilot testing at ABB has, however, shown that when YuMi is programmed with traditional methods, the programming time even for simple assembly tasks remains very long. The SARAFun project has been formed to enable a non-expert user to integrate a new bi-manual assembly task on a YuMi robot in less than a day. This will be accomplished by augmenting the YuMi robot with cutting-edge sensory and cognitive abilities, as well as the reasoning abilities required to plan and execute an assembly task. The overall conceptual approach is that the robot should be capable of learning and executing assembly tasks in a human-like manner. Studies will be conducted to understand how human assembly workers learn and perform assembly tasks. Human performance will be modelled and transferred to the YuMi robot as assembly skills. The robot will learn assembly tasks, such as insertion or folding, by observing the task being performed by a human instructor. The robot will then analyze the task and generate an assembly program, including exception handling, and design 3D-printable fingers tailored for gripping the parts at hand. Aided by the human instructor, the robot will finally learn to perform the actual assembly task, relying on sensory feedback from vision, force and tactile sensing, as well as physical human-robot interaction. During this phase, the robot will gradually improve its understanding of the assembly at hand until it is capable of performing the assembly in a fast and robust manner.
author: Bekiroglu, Yasemin; Haschke, Robert; Karayiannidis, Yiannis; Mariolis, Ioannis; McIntyre, Joseph; Malec, Jacek and Remazeilles, Anthony
organization:
publishing date: 2017
type: Contribution to journal
publication status: published
subject:
in: Impact
volume: 2017
issue: 5
pages: 67-69 (3 pages)
ISSN: 2398-7073
DOI: 10.21820/23987073.2017.5.67
language: Unknown
LU publication?: yes
id: 9859f6ab-49cb-429d-8606-56c355fe4951
alternative location: https://www.ingentaconnect.com/content/sil/impact/2017/00002017/00000005/art00024
date added to LUP: 2022-12-28 11:02:34
date last changed: 2023-01-03 13:51:35
@article{9859f6ab-49cb-429d-8606-56c355fe4951,
  abstract     = {{While industrial robots are very successful in many areas of industrial manufacturing, assembly automation still suffers from complex, time-consuming programming and the need for dedicated hardware. ABB has developed YuMi, a collaborative, inherently safe assembly robot that is expected to reduce integration costs significantly by offering a standardized hardware setup and simple fitting of the robot into existing workplaces. Internal pilot testing at ABB has, however, shown that when YuMi is programmed with traditional methods, the programming time even for simple assembly tasks remains very long. The SARAFun project has been formed to enable a non-expert user to integrate a new bi-manual assembly task on a YuMi robot in less than a day. This will be accomplished by augmenting the YuMi robot with cutting-edge sensory and cognitive abilities, as well as the reasoning abilities required to plan and execute an assembly task. The overall conceptual approach is that the robot should be capable of learning and executing assembly tasks in a human-like manner. Studies will be conducted to understand how human assembly workers learn and perform assembly tasks. Human performance will be modelled and transferred to the YuMi robot as assembly skills. The robot will learn assembly tasks, such as insertion or folding, by observing the task being performed by a human instructor. The robot will then analyze the task and generate an assembly program, including exception handling, and design 3D-printable fingers tailored for gripping the parts at hand. Aided by the human instructor, the robot will finally learn to perform the actual assembly task, relying on sensory feedback from vision, force and tactile sensing, as well as physical human-robot interaction. During this phase, the robot will gradually improve its understanding of the assembly at hand until it is capable of performing the assembly in a fast and robust manner.}},
  author       = {{Bekiroglu, Yasemin and Haschke, Robert and Karayiannidis, Yiannis and Mariolis, Ioannis and McIntyre, Joseph and Malec, Jacek and Remazeilles, Anthony}},
  issn         = {{2398-7073}},
  language     = {{und}},
  number       = {{5}},
  pages        = {{67--69}},
  series       = {{Impact}},
  title        = {{SARAFun, Smart Assembly Robot with Advanced FUNctionalities, H2020}},
  url          = {{http://dx.doi.org/10.21820/23987073.2017.5.67}},
  doi          = {{10.21820/23987073.2017.5.67}},
  volume       = {{2017}},
  year         = {{2017}},
}
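
A citation record like the BibTeX entry above can also be fetched programmatically from the DOI resolver. The following is a minimal sketch in Python, assuming the DOI is registered with an agency (such as Crossref) that supports content negotiation for application/x-bibtex; the requests library and the exact response format are assumptions on my part, not part of this record.

import requests

# DOI taken from the record above.
DOI = "10.21820/23987073.2017.5.67"

# Assumption: https://doi.org/ honours content negotiation for this DOI;
# if the registration agency does not, the server may return HTML instead.
response = requests.get(
    "https://doi.org/" + DOI,
    headers={"Accept": "application/x-bibtex"},
    timeout=30,
)
response.raise_for_status()
print(response.text)  # prints a BibTeX entry comparable to the one above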