
Online Recognition of Actions Involving Objects

Gharaee, Zahra; Gärdenfors, Peter and Johnsson, Magnus (2017) In Cognitive Processing
Abstract
We present an online system for real-time recognition of actions involving objects. The system merges two streams of information processing running in parallel. One is carried out by a hierarchical self-organizing map (SOM) system that recognizes the performed actions by analysing the spatial trajectories of the agent's movements. It consists of two layers of SOMs and a custom-made supervised neural network. The activation sequences in the first-layer SOM represent the sequences of significant postures of the agent during the performance of actions. These activation sequences are subsequently recoded and clustered in the second-layer SOM, and then labeled by the activity in the third-layer custom-made supervised neural network. The second information-processing stream is carried out by a subsystem that determines which object among several in the agent's vicinity the action is applied to. This is achieved by applying a proximity measure. The presented method combines the two information-processing streams to determine what action the agent performed and on what object. The action recognition system has been tested with excellent performance.
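To make the two-stream architecture in the abstract concrete, the following is a minimal illustrative Python sketch, not the authors' implementation: the Som class, recognize_action, closest_object, the histogram recoding and all data are hypothetical stand-ins for the trained hierarchical SOMs, the custom-made supervised network and the paper's proximity measure.

import numpy as np

class Som:
    """Toy self-organizing map: maps an input vector to its best-matching unit."""
    def __init__(self, n_units, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(n_units, dim))

    def best_matching_unit(self, x):
        # Index of the unit whose weight vector is closest to the input.
        return int(np.argmin(np.linalg.norm(self.weights - x, axis=1)))

def recognize_action(posture_frames, first_layer, second_layer, labels):
    """Stream 1: map each posture to a first-layer unit, recode the resulting
    activation sequence as a fixed-length vector, cluster it in the second-layer
    SOM, and look up an action label (a plain dict stands in here for the
    supervised third layer)."""
    sequence = [first_layer.best_matching_unit(frame) for frame in posture_frames]
    recoded = np.bincount(sequence, minlength=first_layer.weights.shape[0]).astype(float)
    cluster = second_layer.best_matching_unit(recoded)
    return labels.get(cluster, "unknown")

def closest_object(hand_position, objects):
    """Stream 2: pick the object the action is applied to using a simple
    Euclidean proximity measure (the paper's exact measure may differ)."""
    return min(objects, key=lambda name: np.linalg.norm(objects[name] - hand_position))

# Usage with made-up data: 10 posture frames of a 6-D skeleton feature vector.
frames = np.random.default_rng(1).normal(size=(10, 6))
first, second = Som(n_units=8, dim=6), Som(n_units=4, dim=8)
action = recognize_action(frames, first, second, labels={0: "push", 1: "grasp"})
target = closest_object(frames[-1][:3],
                        {"cup": np.array([0.2, 0.1, 0.0]), "box": np.array([1.5, 0.0, 0.3])})
print(f"agent performed '{action}' on '{target}'")

The point of the sketch is the merging step at the end: the action label comes from the hierarchical SOM stream, the target object from the proximity stream, and only their combination answers what was done and to what.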
author: Gharaee, Zahra; Gärdenfors, Peter; Johnsson, Magnus
organization:
publishing date: 2017
type: Contribution to journal
publication status: in press
subject:
in: Cognitive Processing
publisher: Springer
ISSN: 1612-4782
project: What you see is what you do (WYSIWYD)
language: English
LU publication? yes
id: 6d1299b8-0bba-4f57-af20-fced586cf4d4 (old id 8567677)
date added to LUP: 2016-01-26 08:52:59
date last changed: 2017-06-22 16:00:45
@article{6d1299b8-0bba-4f57-af20-fced586cf4d4,
  abstract     = {We present an online system for real-time recognition of actions involving objects. The system merges two streams of information processing running in parallel. One is carried out by a hierarchical self-organizing map (SOM) system that recognizes the performed actions by analysing the spatial trajectories of the agent's movements. It consists of two layers of SOMs and a custom-made supervised neural network. The activation sequences in the first-layer SOM represent the sequences of significant postures of the agent during the performance of actions. These activation sequences are subsequently recoded and clustered in the second-layer SOM, and then labeled by the activity in the third-layer custom-made supervised neural network. The second information-processing stream is carried out by a subsystem that determines which object among several in the agent's vicinity the action is applied to. This is achieved by applying a proximity measure. The presented method combines the two information-processing streams to determine what action the agent performed and on what object. The action recognition system has been tested with excellent performance.},
  author       = {Gharaee, Zahra and Gärdenfors, Peter and Johnsson, Magnus},
  issn         = {1612-4782},
  language     = {eng},
  publisher    = {Springer},
  series       = {Cognitive Processing},
  title        = {Online Recognition of Actions Involving Objects},
  year         = {2017},
}