Lund University Publications

Predictions Tasks with Words and Sequences: Comparing a Novel Recurrent Architecture with the Elman Network

Gil, David LU ; Garcia, J ; Cazorla, M and Johnsson, Magnus LU (2011) International Joint Conference on Neural Networks (IJCNN) 2011 p.1207-1213
Abstract
The classical connectionist models are not well suited to working with data varying over time. According to this, temporal connectionist models have emerged and constitute a continuously growing research field. In this paper we present a novel supervised recurrent neural network architecture (SARASOM) based on the Associative Self-Organizing Map (A-SOM). The A-SOM is a variant of the Self-Organizing Map (SOM) that develops a representation of its input space as well as learns to associate its activity with an arbitrary number of additional inputs. In this context the A-SOM learns to associate its previous activity with a delay of one iteration. The performance of the SARASOM was evaluated and compared with the Elman network in a number of prediction tasks using sequences of letters (including some experiments with a reduced lexicon of 10 words). The results are very encouraging with SARASOM learning slightly better than the Elman network.
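The abstract describes the core mechanism only at a high level: a SOM variant whose own map activity is fed back as an associative input with a one-iteration delay, so the map learns to anticipate its next activation. The sketch below is a minimal illustration of that idea under assumed details (Gaussian feed-forward activation, a Hebbian-style associative update, a simple 1-D neighbourhood); it is not the authors' implementation, and all names, parameters, and update rules are hypothetical.

# Illustrative sketch only: a SOM-like map whose ancillary (associative) input
# at step t is its own activity from step t-1, echoing the one-iteration delay
# described in the abstract. Details are assumptions, not the paper's method.
import numpy as np

class RecurrentAssociativeSOM:
    def __init__(self, n_units, input_dim, lr=0.1, assoc_lr=0.05, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.random((n_units, input_dim))        # feed-forward SOM weights
        self.w_rec = rng.random((n_units, n_units)) * 0.01  # associative (recurrent) weights
        self.prev_activity = np.zeros(n_units)
        self.lr, self.assoc_lr, self.sigma = lr, assoc_lr, sigma

    def step(self, x):
        # Feed-forward activity: Gaussian of the distance to each unit's weight vector.
        dist = np.linalg.norm(self.w_in - x, axis=1)
        ff_activity = np.exp(-dist**2 / (2 * self.sigma**2))
        # Associative contribution from the map's own activity one iteration earlier.
        assoc_activity = self.w_rec @ self.prev_activity
        activity = ff_activity + assoc_activity

        # SOM-style update of the feed-forward weights around the best-matching unit.
        bmu = np.argmax(activity)
        neighbourhood = np.exp(-((np.arange(len(activity)) - bmu)**2) / (2 * self.sigma**2))
        self.w_in += self.lr * neighbourhood[:, None] * (x - self.w_in)

        # Hebbian-style update linking the previous activity to the current one.
        self.w_rec += self.assoc_lr * np.outer(activity, self.prev_activity)

        self.prev_activity = activity
        return activity

# Usage (hypothetical): feed a sequence of one-hot encoded letters and read a
# prediction of the next symbol off the unit activities; the decoding scheme
# and any supervised read-out used in the paper are not specified here.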
author
Gil, David ; Garcia, J ; Cazorla, M and Johnsson, Magnus
organization
publishing date
2011
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
[Host publication title missing]
pages
1207 - 1213
conference name
International Joint Conference on Neural Networks (IJCNN) 2011
conference location
San Jose, California, United States
conference dates
2011-07-31 - 2011-08-05
external identifiers
  • scopus:80054746210
ISSN
2161-4393
ISBN
978-1-4244-9635-8
DOI
10.1109/IJCNN.2011.6033361
project
Thinking in Time: Cognition, Communication and Learning
language
English
LU publication?
yes
id
e2c253d5-d93e-4e6e-92a6-253d8fa7740c (old id 1982547)
date added to LUP
2016-04-04 09:22:11
date last changed
2022-01-29 17:33:03
@inproceedings{e2c253d5-d93e-4e6e-92a6-253d8fa7740c,
  abstract     = {{The classical connectionist models are not well suited to working with data varying over time. According to this, temporal connectionist models have emerged and constitute a continuously growing research field. In this paper we present a novel supervised recurrent neural network architecture (SARASOM) based on the Associative Self-Organizing Map (A-SOM). The A-SOM is a variant of the Self-Organizing Map (SOM) that develops a representation of its input space as well as learns to associate its activity with an arbitrary number of additional inputs. In this context the A-SOM learns to associate its previous activity with a delay of one iteration. The performance of the SARASOM was evaluated and compared with the Elman network in a number of prediction tasks using sequences of letters (including some experiments with a reduced lexicon of 10 words). The results are very encouraging with SARASOM learning slightly better than the Elman network.}},
  author       = {{Gil, David and Garcia, J and Cazorla, M and Johnsson, Magnus}},
  booktitle    = {{[Host publication title missing]}},
  isbn         = {{978-1-4244-9635-8}},
  issn         = {{2161-4393}},
  language     = {{eng}},
  pages        = {{1207--1213}},
  title        = {{Predictions Tasks with Words and Sequences: Comparing a Novel Recurrent Architecture with the Elman Network}},
  url          = {{http://dx.doi.org/10.1109/IJCNN.2011.6033361}},
  doi          = {{10.1109/IJCNN.2011.6033361}},
  year         = {{2011}},
}