
Feedforward neural networks with ReLU activation functions are linear splines

Hansson, Magnus LU and Olsson, Christoffer LU (2017) In Bachelor's Theses in Mathematical Sciences NUMK01 20171
Mathematics (Faculty of Engineering)
Abstract
In this thesis the approximation properties of feedforward artificial neural networks with one hidden layer and ReLU activation functions are examined. It is shown that functions of this kind are linear splines and that the number of spline knots depends on the number of nodes in the network; in fact, an upper bound on the number of knots can be derived. Furthermore, the positioning of the knots depends on the optimization of the adjustable parameters of the network. A numerical example is given in which the network models are compared to linear interpolating splines with equidistantly positioned knots.
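The central claim of the abstract can be illustrated numerically. The sketch below (not taken from the thesis; all variable names and the random parameters are illustrative) evaluates a one-hidden-layer ReLU network f(x) = Σᵢ vᵢ·max(wᵢx + bᵢ, 0) + c on scalar input. Each hidden node switches on or off at x = −bᵢ/wᵢ, so n hidden nodes yield at most n knots, and the function is exactly linear between consecutive knots:

```python
import numpy as np

def relu_net(x, w, b, v, c):
    """One hidden layer: f(x) = sum_i v_i * max(w_i*x + b_i, 0) + c."""
    return np.maximum(np.outer(x, w) + b, 0.0) @ v + c

rng = np.random.default_rng(0)
n = 5                      # number of hidden nodes
w = rng.normal(size=n)
b = rng.normal(size=n)
v = rng.normal(size=n)
c = 0.3

# Candidate knots: points where one hidden node's ReLU switches
# between its zero and its affine regime (assuming w_i != 0).
knots = np.sort(-b / w)    # at most n knots

# Between consecutive knots the network is exactly linear: the second
# finite difference over equally spaced points inside one interval vanishes.
for left, right in zip(knots[:-1], knots[1:]):
    xs = np.linspace(left, right, 5)[1:-1]   # strictly inside the interval
    ys = relu_net(xs, w, b, v, c)
    second_diff = ys[:-2] - 2 * ys[1:-1] + ys[2:]
    assert np.allclose(second_diff, 0.0)
```

This checks the linear-spline property directly: within each interval every ReLU term is either identically zero or affine, so their sum is affine, and a knot can appear only where some node crosses its breakpoint.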
author: Hansson, Magnus LU and Olsson, Christoffer LU
course: NUMK01 20171
type: M2 - Bachelor Degree
publication/series: Bachelor's Theses in Mathematical Sciences
report number: LUNFNA-4017-2017
ISSN: 1654-6229
other publication id: 2017:K21
language: English
id: 8929048
date added to LUP: 2018-06-07 17:47:42
date last changed: 2018-06-07 17:47:42
@misc{8929048,
  abstract     = {In this thesis the approximation properties of feedforward artificial neural networks with one hidden layer and ReLU activation functions are examined. It is shown that functions of this kind are linear splines and that the number of spline knots depends on the number of nodes in the network; in fact, an upper bound on the number of knots can be derived. Furthermore, the positioning of the knots depends on the optimization of the adjustable parameters of the network. A numerical example is given in which the network models are compared to linear interpolating splines with equidistantly positioned knots.},
  author       = {Hansson, Magnus and Olsson, Christoffer},
  issn         = {1654-6229},
  language     = {eng},
  note         = {Student Paper},
  series       = {Bachelor's Theses in Mathematical Sciences},
  title        = {Feedforward neural networks with ReLU activation functions are linear splines},
  year         = {2017},
}