LUP Student Papers

LUND UNIVERSITY LIBRARIES

Symbolic Regression using Genetic Programming Leveraging Neural Information Processing

Grytzell, Nanna LU (2021) In Master's Theses in Mathematical Sciences FMAM05 20202
Mathematics (Faculty of Engineering)
Abstract
Regression analysis conducted with traditional mathematical methods can be sub-optimal if the exact model of the observed data is unknown. Evolutionary computing (EC) and deep learning (DL) are viable alternatives, since regression performed with these methods tends to be less dependent on a particular model. EC is especially flexible because it is capable of performing symbolic regression. Genetic programming (GP) and artificial neural networks (ANN) are subfields of EC and DL, respectively. This master's thesis examines the effects of giving a genetic programming system neural information processing capabilities, in order to bridge the gap between ANN and GP.
The approach is to compare GP, in its standard formulation, with 1) GP that speciates using an ANN, and 2) GP that extends the function set with ANNs. Two methods are used to measure the prediction error. The effect of the first approach is increased noise in the convergence. This leads to an enlarged spread of the prediction error for one of our two error measures, and a mainly unchanged error for the other. The effects of the second approach are an increase in accuracy for one of the error measures and a decrease in bloat.
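To illustrate the kind of tree-based symbolic regression the thesis builds on, a minimal GP loop can be sketched as below. This is a generic textbook-style sketch, not the thesis's implementation: the function set, population size, selection scheme, and crossover are all illustrative choices. The thesis's second approach would correspond to adding ANN primitives to the function set, which this sketch omits.

```python
# Minimal sketch of tree-based symbolic regression via genetic programming.
# All names and parameters are illustrative, not the thesis's implementation.
import random, operator

# Function set: (name, arity, callable). The thesis's second approach would
# add ANN nodes to this set; here we use plain arithmetic only.
FUNCS = [('add', 2, operator.add), ('sub', 2, operator.sub),
         ('mul', 2, operator.mul)]
TERMS = ['x', 1.0, 2.0]  # terminal set: the input variable and constants

def random_tree(depth=3):
    """Grow a random expression tree (nested tuples for function nodes)."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    name, arity, fn = random.choice(FUNCS)
    return (fn,) + tuple(random_tree(depth - 1) for _ in range(arity))

def evaluate(tree, x):
    """Recursively evaluate an expression tree at input x."""
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    fn, *args = tree
    return fn(*(evaluate(a, x) for a in args))

def mse(tree, xs, ys):
    """Mean squared prediction error, used as the fitness to minimise."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def crossover(a, b):
    """Subtree crossover: splice b into a random position of a."""
    if not isinstance(a, tuple) or random.random() < 0.5:
        return b
    i = random.randrange(1, len(a))
    return a[:i] + (crossover(a[i], b),) + a[i + 1:]

def evolve(xs, ys, pop_size=60, generations=30, seed=0):
    """Evolve a population of expression trees toward low prediction error."""
    random.seed(seed)
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: mse(t, xs, ys))      # rank by fitness
        elite = pop[:pop_size // 3]                  # keep the best third
        pop = elite + [crossover(random.choice(elite), random_tree(2))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda t: mse(t, xs, ys))

# Hypothetical target: y = x^2 + x; GP searches for a low-error expression.
xs = [i / 4 for i in range(-8, 9)]
ys = [x * x + x for x in xs]
best = evolve(xs, ys)
print(mse(best, xs, ys))
```

Unlike fitting coefficients of a fixed model, the search here operates over expression structures themselves, which is what makes GP less dependent on a particular model; the unrestricted growth of trees across generations is also where the bloat mentioned in the abstract comes from.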
Popular Abstract
The use of nature as a source of inspiration in machine learning has made it possible to develop computer programs from scratch by simulating evolution. Such simulations are fairly complex and bring both possibilities and challenges to the machine learning field.
author: Grytzell, Nanna
alternative title: Symbolisk regression med genetisk programmering och neural informations processering
course: FMAM05 20202
year: 2021
type: H2 - Master's Degree (Two Years)
keywords: evolutionary computation, evolutionary algorithm, evolutionary algorithms, genetic algorithm, genetic programming, artificial neural networks, artificial intelligence, machine learning
publication/series: Master's Theses in Mathematical Sciences
report number: LUTFMA-3438-2021
ISSN: 1404-6342
other publication id: 2021:E6
language: English
id: 9046100
date added to LUP: 2021-06-11 16:40:50
date last changed: 2021-06-11 16:40:50
@misc{9046100,
  abstract     = {{Regression analysis conducted with traditional mathematical methods can be sub-optimal if the exact model of the observed data is unknown. Evolutionary computing (EC) and deep learning (DL) are viable alternatives, since regression performed with these methods tends to be less dependent on a particular model. EC is especially flexible because it is capable of performing symbolic regression. Genetic programming (GP) and artificial neural networks (ANN) are subfields of EC and DL, respectively. This master's thesis examines the effects of giving a genetic programming system neural information processing capabilities, in order to bridge the gap between ANN and GP.
	The approach is to compare GP, in its standard formulation, with 1)~GP that speciates using an ANN, and 2)~GP that extends the function set with ANNs. Two methods are used to measure the prediction error. The effect of the first approach is increased noise in the convergence. This leads to an enlarged spread of the prediction error for one of our two error measures, and a mainly unchanged error for the other. The effects of the second approach are an increase in accuracy for one of the error measures and a decrease in bloat.}},
  author       = {{Grytzell, Nanna}},
  issn         = {{1404-6342}},
  language     = {{eng}},
  note         = {{Student Paper}},
  series       = {{Master's Theses in Mathematical Sciences}},
  title        = {{Symbolic Regression using Genetic Programming Leveraging Neural Information Processing}},
  year         = {{2021}},
}