
Exploring the LASSO as a Pruning Method

Ekbom, Leo LU (2019) FYTK02 20191
Computational Biology and Biological Physics
Abstract
In this study, the efficiency of various pruning algorithms was investigated, with an emphasis on regularization methods. Pruning is a technique that aims to remove excess components from a neural network. In particular, the study covered the LASSO (Least Absolute Shrinkage and Selection Operator) and extensions derived from it, which were compared with other methods, including optimal brain damage and the elastic net. Initially, this was implemented for MLPs, but the same methods were extended to CNNs with some alterations for increased computational efficiency. Pruning was then applied at the level of weights, neurons, and filters. It was concluded that the LASSO tends to yield superior sparsity at the level of weights, but the group LASSO's ability to select variables simultaneously is a worthwhile addition. Moreover, optimal results can be obtained by combining both penalties when regularizing the cost function.
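As a concrete illustration of the core idea in the abstract, here is a minimal sketch of LASSO-style pruning on a linear model (this is not the thesis code; the data, hyperparameters, and function names are invented for illustration). Training with proximal gradient descent (ISTA) on an L1-penalized squared loss drives the weights of uninformative inputs to exactly zero, and those weights can then be pruned outright:

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 penalty: shrinks weights toward zero."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def lasso_prune(X, y, lam=0.1, lr=0.01, steps=2000):
    """Fit a linear model with ISTA (proximal gradient descent on the
    LASSO objective), then return the weights and a mask of survivors."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n          # gradient of the squared loss
        w = soft_threshold(w - lr * grad, lr * lam)
    mask = w != 0.0                           # weights that survive pruning
    return w, mask

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.5, 1.0]                 # only 3 informative features
y = X @ true_w + 0.01 * rng.standard_normal(200)

w, mask = lasso_prune(X, y, lam=0.1)
print(mask.sum(), "of", len(w), "weights survive pruning")
```

In a neural network the same penalty is added to the cost function per weight (or, for the group LASSO, per neuron or filter via the group norm), but the mechanism is the one shown here: the L1 term produces exact zeros rather than merely small values, so the pruning decision falls out of training itself.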
Popular Abstract (translated from Swedish)
In the brains of many mammals, a process known as "synaptic pruning" takes place, in which the brain discards its less-used connections. By analogy, processes that remove superfluous objects from neural networks are called "pruning". One of the original ways to achieve this was the so-called "optimal brain damage", which defined a way to rank parameters so that performance would be affected as little as possible. In other words, one wants to know how much the network can be damaged while still letting it process information just as effectively. This is done because modern networks, for example those used for image analysis or speech recognition, often become large and extremely complex. Through pruning, their size can be reduced and their speed increased, which is particularly important for less powerful mobile devices. Many other pruning methods revolve around regularization, which means adjusting the network's loss in some way to obtain advantageous behaviour. One of the most important of these is called the LASSO (Least Absolute Shrinkage and Selection Operator). In this study, the LASSO is tested as a pruning tool and compared with many other methods, for different types of networks.
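The "optimal brain damage" ranking described above can be sketched in a few lines (the weights and curvatures below are invented for illustration). Under a diagonal-Hessian approximation, OBD assigns each parameter the saliency s_i = ½ h_ii w_i², where h_ii is the i-th diagonal entry of the Hessian of the loss, and prunes the lowest-saliency parameters first:

```python
import numpy as np

# Toy illustration of Optimal Brain Damage's saliency ranking.
# With a diagonal Hessian approximation, parameter i gets saliency
# s_i = 0.5 * h_ii * w_i**2; small saliency means removing the
# parameter barely changes the loss, so it is pruned first.
w = np.array([0.9, -0.05, 0.4, 0.01, -0.7])     # illustrative weights
h_diag = np.array([1.0, 2.0, 0.5, 3.0, 1.2])    # assumed Hessian diagonal

saliency = 0.5 * h_diag * w**2
order = np.argsort(saliency)                    # prune smallest first
print("pruning order (indices):", order)
```

Note that a large weight with low curvature can matter less than a smaller weight sitting in a high-curvature direction, which is exactly why OBD ranks by saliency rather than by weight magnitude alone.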
author: Ekbom, Leo LU
course: FYTK02 20191
year: 2019
type: M2 - Bachelor Degree
keywords: neural networks, pruning, lasso, group lasso, obd
report number: LU TP 19-25
language: English
id: 8993061
date added to LUP: 2019-08-28 09:42:46
date last changed: 2019-08-28 09:42:46
@misc{8993061,
  abstract     = {In this study, the efficiency of various pruning algorithms was investigated, with an emphasis on regularization methods. Pruning is a technique that aims to remove excess components from a neural network. In particular, the study covered the LASSO (Least Absolute Shrinkage and Selection Operator) and extensions derived from it, which were compared with other methods, including optimal brain damage and the elastic net. Initially, this was implemented for MLPs, but the same methods were extended to CNNs with some alterations for increased computational efficiency. Pruning was then applied at the level of weights, neurons, and filters. It was concluded that the LASSO tends to yield superior sparsity at the level of weights, but the group LASSO's ability to select variables simultaneously is a worthwhile addition. Moreover, optimal results can be obtained by combining both penalties when regularizing the cost function.},
  author       = {Ekbom, Leo},
  keyword      = {neural networks,pruning,lasso,group lasso,obd},
  language     = {eng},
  note         = {Student Paper},
  title        = {Exploring the LASSO as a Pruning Method},
  year         = {2019},
}