LUP Student Papers

LUND UNIVERSITY LIBRARIES

Self-adaptive random walk with pseudo-gradients for genetic evolution of an artificial neural network

Emanuelsson, Robin LU (2020) FYTM03 20201
Computational Biology and Biological Physics - Undergoing reorganization
Abstract
To optimize the weights in an artificial neural network, most methods rely on gradients, which are not always obtainable or desirable. Evolutionary algorithms are instead based on Darwinian evolution, where no derivative is needed. These algorithms have a set of strategy parameters that can be dynamically updated during the search to increase performance. Two ways of updating the parameters are the so-called "1/5th rule", which uses the offspring survival rate to self-adapt, and random mutation, which uses inheritance and mutation to evolve the strategy parameters as well.

We present an algorithm that combines aspects of these two self-adaptation methods by changing the strategy parameters differently for new offspring and for older survivors. We also introduce a pseudo-gradient by adding a memory of the previous step taken in the search space and letting the new mutation be shifted by this remembered step. In this investigation, these two methods failed to improve on the performance of the "1/5th rule" but performed better than random mutation. The new algorithms showed promising results regarding combining aspects of the "1/5th rule" and random mutation.
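The "1/5th rule" mentioned in the abstract is a classical self-adaptation scheme for evolution strategies: the mutation step size is increased when more than one fifth of offspring beat their parent, and decreased otherwise. A minimal sketch of it in a (1+1)-ES on a toy objective, assuming illustrative parameter names and a standard multiplicative update (not the thesis's exact formulation):

```python
import random

def one_fifth_es(f, x0, sigma=1.0, iters=2000, check=20, factor=0.82):
    """Minimize f with a (1+1) evolution strategy using the 1/5th rule.

    Every `check` iterations, sigma shrinks if the success rate
    (fraction of offspring better than the parent) is below 1/5
    and grows if it is above.  Parameter names and the update
    factor are illustrative choices, not taken from the thesis.
    """
    x = list(x0)
    fx = f(x)
    successes = 0
    for t in range(1, iters + 1):
        # offspring: isotropic Gaussian mutation of the parent
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:              # offspring survives, replaces parent
            x, fx = y, fy
            successes += 1
        if t % check == 0:       # self-adapt the strategy parameter
            rate = successes / check
            sigma = sigma / factor if rate > 0.2 else sigma * factor
            successes = 0
    return x, fx

# usage: minimize the sphere function from a fixed start
random.seed(1)
sphere = lambda v: sum(vi * vi for vi in v)
best, best_f = one_fifth_es(sphere, [5.0, -3.0])
```

The multiplicative update keeps the success rate hovering around 1/5, which for simple unimodal landscapes roughly balances progress per step against the probability of improvement.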
Popular Abstract
Ever since the invention of the computer, there has been a demand for faster computers, both for commercial and scientific use. One such scientific use is data classification, which has turned out to have a mind-boggling number of applications, from self-driving cars to fraud detection.

Machine learning is a way to classify data. To make this process more effective, scientists have turned towards nature, and especially evolution, for inspiration. In evolution, the survivors pass on their characteristics to successive generations, but there is always a chance of small mutations that can benefit the individual or be a disadvantage. Since a mutation that benefits an individual increases its fitness for survival and gives it a higher chance to reproduce, we have a natural selection algorithm. This way of thinking can be applied in machine learning. A candidate solution to a problem is seen as an individual, and a group of these candidate solutions is called a population. The population is spread out, and the individuals' positions can be seen as slight differences in genetic stock. Depending on their positions, the individuals have different fitness with respect to the problem that needs to be optimized. The best individuals, or solutions if you will, survive and reproduce in some manner, and hopefully their offspring will be even better individuals.

These computer algorithms are called evolutionary algorithms. Individuals reproduce by taking a random step away from a surviving individual. Imagine how a tree sends out pollen in the spring: the pollen spreads in all directions from the tree, and, hopefully for the tree, some direction was good, so a new tree can start to grow. This reproduction procedure is not the most effective way of finding new and better individuals, due to its random nature. In this project we propose a new algorithm.

Imagine that you and a group of friends are on a quest for the perfect four-leaf clover. Once you are all at the starting point, you face a decision: in what direction should you go to find this four-leaf clover? You all head out in different directions. You get lucky and find some regular clover. You think to yourself that this was a good direction and keep going the same general way in hopes of finding the four-leaf clover. Some of your friends weren't so lucky, and they decide to go to your position, where they know there is at least some clover. They aren't so venturous now that they have already failed once, so they decide to start searching much closer to your previous position. Some start to get their hopes up again, regain their courage, and search in a direction where there seems to be a higher concentration of clovers, while others give up completely. We hope to improve the training of computers with an algorithm based on this type of logic.
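The searching-with-memory idea in the story above — remembering the last step that worked and biasing the next move toward it — is what the abstract calls a pseudo-gradient. A minimal sketch of that mutation inside a greedy (1+1) search, assuming a simple momentum-style shift with an illustrative weight `beta` (the exact update rule in the thesis may differ):

```python
import random

def mutate_with_memory(x, prev_step, sigma=0.5, beta=0.5):
    """One mutation with a pseudo-gradient: the Gaussian random step
    is shifted by a remembered fraction of the previous accepted step.
    `beta` and this exact shift are illustrative assumptions."""
    step = [sigma * random.gauss(0.0, 1.0) + beta * p for p in prev_step]
    y = [xi + si for xi, si in zip(x, step)]
    return y, step

def search(f, x0, iters=3000, sigma=0.5, beta=0.5):
    """Greedy (1+1) search: keep an offspring only if it improves f,
    and remember the accepted step to bias the next mutation."""
    x, fx = list(x0), f(x0)
    prev = [0.0] * len(x0)      # no memory at the start
    for _ in range(iters):
        y, step = mutate_with_memory(x, prev, sigma, beta)
        fy = f(y)
        if fy < fx:
            x, fx, prev = y, fy, step   # accept and remember the step
        # on failure, keep the old memory (one simple design choice)
    return x, fx

# usage: minimize the sphere function from a fixed start
random.seed(2)
sphere = lambda v: sum(vi * vi for vi in v)
best, best_f = search(sphere, [4.0, 4.0])
```

When successive accepted steps point the same way, the memory term lengthens moves along that direction, much like the friend who keeps walking toward where the clover was found.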
author: Emanuelsson, Robin LU
supervisor:
organization:
course: FYTM03 20201
year:
type: H2 - Master's Degree (Two Years)
subject:
language: English
id: 9022054
date added to LUP: 2020-06-24 19:12:26
date last changed: 2020-06-24 19:12:26
@misc{9022054,
  abstract     = {{To optimize the weights in an artificial neural network, most methods rely on gradients, which are not always obtainable or desirable. Evolutionary algorithms are instead based on Darwinian evolution, where no derivative is needed. These algorithms have a set of strategy parameters that can be dynamically updated during the search to increase performance. Two ways of updating the parameters are the so-called ``1/5th rule'', which uses the offspring survival rate to self-adapt, and random mutation, which uses inheritance and mutation to evolve the strategy parameters as well.

We present an algorithm that combines aspects of these two self-adaptation methods by changing the strategy parameters differently for new offspring and for older survivors. We also introduce a pseudo-gradient by adding a memory of the previous step taken in the search space and letting the new mutation be shifted by this remembered step. In this investigation, these two methods failed to improve on the performance of the ``1/5th rule'' but performed better than random mutation. The new algorithms showed promising results regarding combining aspects of the ``1/5th rule'' and random mutation.}},
  author       = {{Emanuelsson, Robin}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Self-adaptive random walk with pseudo-gradients for genetic evolution of an artificial neural network}},
  year         = {{2020}},
}