
Applying Dropout to Prevent Shallow Neural Networks from Overtraining

Huynh, Denhanh (2017) FYTK02 20161
Computational Biology and Biological Physics
Abstract
Artificial neural networks are machine learning systems based on the neural networks of the human brain. A problem that has to be overcome for neural networks is overtraining, which means that the network performs well on data that has been used for training the network, but does not make good predictions on new data. One branch of artificial neural networks, called deep neural networks, uses a lot of hidden layers of neurons to produce state-of-the-art results on a wide variety of problems. Because of the size of these networks, training requires a lot of computation, and some methods for dealing with overtraining that are available for shallow neural networks, with only a few hidden layers, become impractical. Dropout is a recently developed method to reduce overtraining without being too computationally demanding for deep neural networks. In this project, however, dropout is applied to shallow neural networks, and in this thesis it is shown that dropout is a good way to reduce overtraining in shallow neural networks on a variety of classification problems.
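The abstract describes dropout only in words. As an illustrative sketch (not code from the thesis), the commonly used "inverted dropout" formulation can be written in a few lines of Python: during training, each hidden unit is zeroed with probability p and the survivors are rescaled by 1/(1-p) so the expected activation is unchanged; at test time the layer is left untouched. The function name and the NumPy setup are this sketch's own choices, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(activations, p_drop=0.5, training=True):
    """Inverted dropout: during training, zero each unit with
    probability p_drop and rescale survivors by 1/(1 - p_drop)
    so the expected activation stays the same. At test time the
    activations pass through unchanged."""
    if not training or p_drop == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

# Hidden-layer activations of a small (shallow) network
h = np.array([0.2, 1.5, -0.3, 0.8])
h_train = dropout_forward(h, p_drop=0.5, training=True)   # some units zeroed, rest doubled
h_test = dropout_forward(h, p_drop=0.5, training=False)   # identical to h
```

Because a different random mask is drawn on every training pass, the network is effectively trained as an ensemble of thinned sub-networks, which is what reduces overtraining.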
Popular Abstract (translated from Swedish)
Today's society is becoming ever more computerized, and the technology ever more advanced. At the forefront of this technology is artificial intelligence. Today there are artificial intelligences that can beat world champions in a range of board games such as chess and Go, that can analyze and understand images and speech, and that can assist physicians in diagnosing patients. Artificial intelligence is also applied in self-driving cars. This is possible thanks to the development of artificial neural networks, which are based on the neurons and the complex neural networks of the brain. A relatively new method called dropout has been shown to improve the ability of deep neural networks to solve new tasks based on previous examples. This project studies dropout by applying the method to small networks. Small networks have the advantages that they are easier to implement and faster to train, which makes it easier to test the method on a variety of problems.
author: Huynh, Denhanh
course: FYTK02 20161
year: 2017
type: M2 - Bachelor Degree
language: English
id: 8899895
date added to LUP: 2017-01-25 14:39:46
date last changed: 2017-10-06 16:06:51
@misc{8899895,
  abstract     = {Artificial neural networks are machine learning systems based on the neural networks of the human brain. A problem that has to be overcome for neural networks is overtraining, which means that the network performs well on data that has been used for training the network, but does not make good predictions on new data. One branch of artificial neural networks, called deep neural networks, uses a lot of hidden layers of neurons to produce state-of-the-art results on a wide variety of problems. Because of the size of these networks, training requires a lot of computation, and some methods for dealing with overtraining that are available for shallow neural networks, with only a few hidden layers, become impractical. Dropout is a recently developed method to reduce overtraining without being too computationally demanding for deep neural networks. In this project, however, dropout is applied to shallow neural networks, and in this thesis it is shown that dropout is a good way to reduce overtraining in shallow neural networks on a variety of classification problems.},
  author       = {Huynh, Denhanh},
  language     = {eng},
  note         = {Student Paper},
  title        = {Applying Dropout to Prevent Shallow Neural Networks from Overtraining},
  year         = {2017},
}