
LUP Student Papers

LUND UNIVERSITY LIBRARIES

Optimization algorithms underlying neural networks: Classification of meditative states by use of recurrent neural networks

Clancy, Oisín Hugh LU (2022) In Bachelor's Theses in Mathematical Sciences MATK11 20212
Mathematics (Faculty of Engineering)
Mathematics (Faculty of Sciences)
Centre for Mathematical Sciences
Abstract
Neural networks can be utilized for an ever-widening selection of tasks. In this thesis the most common optimization algorithms underlying neural networks are investigated: classical momentum, Nesterov momentum, AdaGrad, AdaDelta, RMSprop, Adam, AdaMax, and Nadam. The mathematics that these algorithms are based on is described. A summary of key components of a neural network (activation functions and loss functions) provides the prerequisite knowledge for understanding the place of optimization algorithms within the whole. Classification of time-series data (sequential modelling) can be accomplished with Recurrent Neural Networks (RNNs), which allow time-dependent information to be captured; a brief overview of the structure of this type of network is given. The optimization algorithms, activation functions, loss functions, and RNN that are first described theoretically are then implemented explicitly in a real-world problem: classifying EEG data of different meditative practices that entail subjective changes in experience. Through this it is shown how the mathematics underlying these tools eventually leads to results in a diverse range of scientific explorations, in this case the classification of conscious states.
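To make one of the surveyed update rules concrete, here is a minimal NumPy sketch of the Adam update on a toy one-dimensional objective. The hyperparameter defaults are the commonly published ones; the function name and the toy objective are illustrative choices, and none of this code is taken from the thesis itself.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m) and
    of its square (v), bias-corrected, then a per-parameter scaled step."""
    m = beta1 * m + (1 - beta1) * grad         # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize the toy objective f(theta) = theta**2, starting from theta = 1.0.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
```

Because the step is normalized by the second-moment estimate, the effective step size stays close to `lr` regardless of the gradient's scale, which is the property that distinguishes Adam from plain momentum methods.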
Popular Abstract
Neural networks (NNs) are the tool at the base of the recent global impact of artificial intelligence. They are coarse approximations of how the neurons in our brain work: mathematical simplifications of the electro-chemical signalling between neurons that attempt to model the learning process. NNs rely on different mathematical components in order to function, drawing on topics such as calculus, linear algebra, optimization, and more. The optimization component is about ensuring that the learning process goes in the correct direction: we want NNs to learn useful patterns in information and to reduce the error in their learning, so we use optimization algorithms to direct the NNs to reduce that error. Various methods, that is, algorithms, have been created to improve this process.
This thesis gives a brief introduction to neural networks and the various components of a network. It then focuses on a selection of optimization algorithms that have been used in practice, exploring the mathematical equations that define them and describing the differences between them.
After this, an experiment is described in which a NN is implemented to distinguish between brain data collected using an EEG (a brain-recording device) while a participant engaged in a selection of meditation practices that entail subjective changes in experience. The changes in subjective experience that a person undergoes should be correlated with different neural activity. The goal of the NN is to reliably distinguish between these meditation practices by recognizing the different patterns in that neural activity.
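The classification setup described above can be sketched as a vanilla RNN that reads a multichannel window sample by sample and maps its final hidden state to class probabilities. The sketch below is untrained and randomly initialized; the channel count, window length, and layer sizes are invented for illustration and are not the architecture used in the thesis. It only shows the data flow of an RNN classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one EEG window: 4 channels, 50 time steps (invented sizes).
n_channels, n_steps, n_hidden, n_classes = 4, 50, 16, 2

# Randomly initialized weights; a trained model would learn these.
W_xh = rng.normal(0, 0.1, (n_hidden, n_channels))   # input -> hidden
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))     # hidden -> hidden (recurrence)
b_h  = np.zeros(n_hidden)
W_hy = rng.normal(0, 0.1, (n_classes, n_hidden))    # hidden -> class logits
b_y  = np.zeros(n_classes)

def rnn_classify(x):
    """Forward pass: tanh recurrence over time, softmax over the final state."""
    h = np.zeros(n_hidden)
    for t in range(x.shape[1]):
        h = np.tanh(W_xh @ x[:, t] + W_hh @ h + b_h)   # hidden-state update
    logits = W_hy @ h + b_y
    p = np.exp(logits - logits.max())                  # numerically stable softmax
    return p / p.sum()                                 # class probabilities

probs = rnn_classify(rng.normal(size=(n_channels, n_steps)))
```

Because the hidden state is fed back at every time step, the final state summarizes the whole window, which is how an RNN captures the time-dependent structure that distinguishes one recorded state from another.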
author: Clancy, Oisín Hugh LU
supervisor:
organization:
course: MATK11 20212
year: 2022
type: M2 - Bachelor Degree
subject:
publication/series: Bachelor's Theses in Mathematical Sciences
report number: LUNFMA-4152-2022
ISSN: 1654-6229
other publication id: 2022:K24
language: English
id: 9112234
date added to LUP: 2024-04-15 16:48:20
date last changed: 2024-04-15 16:48:20
@misc{9112234,
  abstract     = {{Neural networks can be utilized for an ever widening selection of tasks. In this thesis the most common optimization algorithms underlying neural networks are investigated: classical momentum, Nesterov momentum, AdaGrad, AdaDelta, RMSprop, Adam, AdaMax, and Nadam. The underlying mathematics that these algorithms are based on is described. There is a summary of key components of a neural network—activation functions and loss functions—which provides prerequisite knowledge for understanding the place of optimization algorithms within the whole. Classification of time series data (sequential modelling) can be accomplished through the use of Recurrent Neural Networks (RNNs). These networks allow for the capture of time dependent information. A brief overview of the structure of this type of network is given. The optimization algorithms, activation functions, loss functions, and RNN that have previously been described theoretically are then implemented explicitly in a real world problem. They are used to classify EEG data of different meditative practices that entail subjective changes in experience. Through this it is shown how the mathematics underlying these tools eventually leads to results in a diverse range of scientific explorations: in this case, the classification of conscious states.}},
  author       = {{Clancy, Oisín Hugh}},
  issn         = {{1654-6229}},
  language     = {{eng}},
  note         = {{Student Paper}},
  series       = {{Bachelor's Theses in Mathematical Sciences}},
  title        = {{Optimization algorithms underlying neural networks: Classification of meditative states by use of recurrent neural networks}},
  year         = {{2022}},
}