Lund University Publications

Internal simulation of perception: a minimal neuro-robotic model

Ziemke, T.; Jirenhed, Dan-Anders and Hesslow, Germund (2005) In Neurocomputing 68, p. 85-104
Abstract
This paper explores the possibility of providing robots with an 'inner world' based on internal simulation of perception rather than an explicit representational world model. First a series of initial experiments is discussed, in which recurrent neural networks were evolved to control collision-free corridor following behavior in a simulated Khepera robot and predict the next time step's sensory input as accurately as possible. Attempts to let the robot act blindly, i.e. repeatedly using its own prediction instead of the real sensory input, were not particularly successful. This motivated the second series of experiments, on which this paper focuses. A feed-forward network was used which, as above, controlled behavior and predicted sensory input. However, weight evolution was now guided by the sole fitness criterion of successful, 'blindfolded' corridor following behavior, including timely turns, as above using as input only own sensory predictions rather than actual sensory input. The trained robot is in some cases actually able to move blindly in a simple environment for hundreds of time steps, successfully handling several multi-step turns. Somewhat surprisingly, however, it does so based on self-generated input that is not particularly similar to the actual sensory values.
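The abstract describes a network that simultaneously outputs motor commands and a prediction of the next sensory input, with weights evolved so that the robot can keep following a corridor when its own predictions replace the real sensor readings. Below is a minimal Python sketch of that closed-loop ("blindfolded") arrangement only; the network sizes, the one-dimensional toy corridor, and the (1+1) hill-climber standing in for the paper's genetic algorithm are all illustrative assumptions, not the authors' implementation or the Khepera simulator.

```python
# Minimal sketch (not the authors' code) of the closed-loop "blindfolded" setup:
# a feed-forward network maps the current sensor reading to (a) motor commands
# and (b) a prediction of the next sensor reading. After an initial open-loop
# phase, the network's own prediction is fed back in place of the real sensors.
# All sizes and the toy environment are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N_SENSORS, N_HIDDEN, N_MOTORS = 8, 6, 2   # Khepera-like: 8 IR sensors, 2 wheels


def unpack(genome):
    """Split a flat genome into the network's two weight matrices (with biases)."""
    n1 = N_HIDDEN * (N_SENSORS + 1)
    w_in = genome[:n1].reshape(N_HIDDEN, N_SENSORS + 1)
    w_out = genome[n1:].reshape(N_MOTORS + N_SENSORS, N_HIDDEN + 1)
    return w_in, w_out


GENOME_LEN = N_HIDDEN * (N_SENSORS + 1) + (N_MOTORS + N_SENSORS) * (N_HIDDEN + 1)


def forward(genome, sensors):
    """One network step: sensors -> (motor commands, predicted next sensors)."""
    w_in, w_out = unpack(genome)
    h = np.tanh(w_in @ np.append(sensors, 1.0))                # hidden layer + bias
    out = 1.0 / (1.0 + np.exp(-(w_out @ np.append(h, 1.0))))   # sigmoid outputs
    return out[:N_MOTORS], out[N_MOTORS:]                      # motors, prediction


def run_blind(genome, env_step, steps=300, open_loop=50):
    """Drive the simulated robot; after `open_loop` steps the real sensor
    reading is replaced by the network's own prediction ('blindfolded')."""
    sensors = np.zeros(N_SENSORS)
    fitness = 0.0
    for t in range(steps):
        motors, predicted = forward(genome, sensors)
        real_sensors, alive, reward = env_step(motors)   # environment update
        if not alive:                                    # collision ends the run
            break
        fitness += reward
        sensors = real_sensors if t < open_loop else predicted
    return fitness


def toy_corridor_step_factory():
    """A crude stand-in for the simulated corridor: a point robot between two
    walls at x = 0 and x = 1; 'sensors' report the distances to the walls."""
    state = {"x": 0.5}

    def step(motors):
        # Differential drive collapsed to 1-D: motor difference pushes sideways.
        state["x"] += 0.05 * (motors[0] - motors[1])
        alive = 0.0 < state["x"] < 1.0
        left, right = state["x"], 1.0 - state["x"]
        sensors = np.tile([left, right], N_SENSORS // 2)
        reward = 1.0 if alive else 0.0          # fitness = collision-free time
        return sensors, alive, reward

    return step


# Simple (1+1) evolutionary loop over the flat weight genome -- a stand-in for
# the genetic algorithm used in the paper, for illustration only.
best = rng.normal(scale=0.5, size=GENOME_LEN)
best_fit = run_blind(best, toy_corridor_step_factory())
for generation in range(200):
    child = best + rng.normal(scale=0.1, size=GENOME_LEN)
    fit = run_blind(child, toy_corridor_step_factory())
    if fit >= best_fit:
        best, best_fit = child, fit
print("best blindfolded fitness:", best_fit)
```

As in the paper's second experimental series, the fitness here scores only the quality of the closed-loop behavior (how long the robot survives while driving on its own predictions), not the accuracy of the predictions themselves.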
author
Ziemke, T. ; Jirenhed, Dan-Anders and Hesslow, Germund
organization
publishing date
2005
type
Contribution to journal
publication status
published
subject
keywords
cognitive, sensory anticipation, inner world, simulation of perception, representation, robotics
in
Neurocomputing
volume
68
pages
85 - 104
publisher
Elsevier
external identifiers
  • wos:000232262900006
  • scopus:24144444490
ISSN
0925-2312
DOI
10.1016/j.neucom.2004.12.005
language
English
LU publication?
yes
id
c138951a-0348-4f7b-b7bf-326eed727c5d (old id 221309)
date added to LUP
2016-04-01 12:32:13
date last changed
2022-05-16 12:08:51
@article{c138951a-0348-4f7b-b7bf-326eed727c5d,
  abstract     = {{This paper explores the possibility of providing robots with an 'inner world' based on internal simulation of perception rather than an explicit representational world model. First a series of initial experiments is discussed, in which recurrent neural networks were evolved to control collision-free corridor following behavior in a simulated Khepera robot and predict the next time step's sensory input as accurately as possible. Attempts to let the robot act blindly, i.e. repeatedly using its own prediction instead of the real sensory input, were not particularly successful. This motivated the second series of experiments, on which this paper focuses. A feed-forward network was used which, as above, controlled behavior and predicted sensory input. However, weight evolution was now guided by the sole fitness criterion of successful, 'blindfolded' corridor following behavior, including timely turns, as above using as input only own sensory predictions rather than actual sensory input. The trained robot is in some cases actually able to move blindly in a simple environment for hundreds of time steps, successfully handling several multi-step turns. Somewhat surprisingly, however, it does so based on self-generated input that is not particularly similar to the actual sensory values.}},
  author       = {{Ziemke, T and Jirenhed, Dan-Anders and Hesslow, Germund}},
  issn         = {{0925-2312}},
  keywords     = {{cognitive; sensory anticipation; inner world; simulation of perception; representation; robotics}},
  language     = {{eng}},
  pages        = {{85--104}},
  publisher    = {{Elsevier}},
  journal      = {{Neurocomputing}},
  title        = {{Internal simulation of perception: a minimal neuro-robotic model}},
  url          = {{http://dx.doi.org/10.1016/j.neucom.2004.12.005}},
  doi          = {{10.1016/j.neucom.2004.12.005}},
  volume       = {{68}},
  year         = {{2005}},
}