Group-Sparse Regression: With Applications in Spectral Analysis and Audio Signal Processing
(2017) Abstract
 This doctoral thesis focuses on sparse regression, a statistical modeling tool for selecting valuable predictors in underdetermined linear models. By imposing different constraints on the structure of the variable vector in the regression problem, one obtains estimates which have sparse supports, i.e., where only a few of the elements in the response variable have nonzero values. The thesis collects six papers which, to a varying extent, deal with the applications, implementations, modifications, translations, and other analysis of such problems. Sparse regression is often used to approximate additive models with intricate, nonlinear, nonsmooth, or otherwise problematic functions by creating an underdetermined model consisting of candidate values for these functions, together with linear response variables which select among the candidates. Sparse regression is therefore a widely used tool in applications such as image processing, audio processing, and seismological and biomedical modeling, but is also frequently used in data mining applications such as social network analytics, recommender systems, and other behavioral applications. Sparse regression belongs to the class of regularized regression problems, where a fitting term, often the sum of squared model residuals, is accompanied by a regularization term which grows as the fit term shrinks, thereby trading off model fit for a sought sparsity pattern. Typically, the regression problems are formulated as convex optimization programs, a discipline in optimization where first-order conditions are sufficient for optimality, a local optimum is also the global optimum, and where numerical methods are abundant, approachable, and often very efficient. The main focus of this thesis is structured sparsity, where the linear predictors are clustered into groups, and sparsity is assumed to be correspondingly group-wise in the response variable.
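The trade-off described above — a squared-residual fit term balanced against a group-wise penalty — can be sketched in a few lines. This is a minimal illustration of the general group-lasso-style objective and its associated block soft-thresholding operator (the proximal map used in methods such as proximal gradient, which the thesis's keywords mention), not the thesis's own estimator; the function names and the small example are illustrative.

```python
import numpy as np

def group_lasso_objective(A, y, x, groups, lam):
    """Regularized regression objective: a squared-residual fit term plus a
    group-wise l2 penalty that trades model fit for group sparsity."""
    fit = 0.5 * np.sum((y - A @ x) ** 2)
    penalty = sum(np.linalg.norm(x[g]) for g in groups)
    return fit + lam * penalty

def block_soft_threshold(v, t):
    """Proximal operator of t * ||v||_2: shrinks the whole group toward zero
    and sets it exactly to zero when its norm falls below t, which is how
    entire groups of predictors are deselected at once."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v
```

Increasing `lam` makes more group norms fall below the threshold, so whole groups vanish from the support simultaneously rather than coefficient by coefficient.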
The first three papers in the thesis, A-C, concern group-sparse regression for temporal identification and spatial localization of different features in audio signal processing. In Paper A, we derive a model for audio signals recorded on an array of microphones, arbitrarily placed in a three-dimensional space. In a two-step group-sparse modeling procedure, we first identify and separate the recorded audio sources, and then localize their origins in space. In Paper B, we examine the multi-pitch model for tonal audio signals, such as musical tones, tonal speech, or mechanical sounds from combustion engines. It typically models the signal-of-interest using a group of spectral lines, located at integer multiples of a fundamental frequency. In this paper, we replace the regularizers used in previous works by a group-wise total variation function, promoting a smooth spectral envelope. The proposed combination of regularizers thereby avoids the common sub-octave error, where the fundamental frequency is incorrectly estimated as half of its true value. In Paper C, we analyze the performance of group-sparse regression for classification by chroma, also known as pitch class, e.g., the musical note C, independent of the octave.
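The multi-pitch model above groups spectral lines at integer multiples of each candidate fundamental. A generic way to realize this is a harmonic dictionary whose columns belonging to one candidate fundamental form one group, so that group-sparse regression selects whole pitches rather than isolated lines. The following is a sketch of such a dictionary under assumed names and parameters, not the specific construction used in Paper B.

```python
import numpy as np

def multipitch_dictionary(f0_candidates, n_harmonics, n_samples, fs):
    """Build a dictionary whose columns are complex sinusoids at integer
    multiples (harmonics) of each candidate fundamental frequency f0.
    Returns the dictionary matrix and the per-candidate column groups."""
    t = np.arange(n_samples) / fs
    cols, groups, k = [], [], 0
    for f0 in f0_candidates:
        idx = []
        for h in range(1, n_harmonics + 1):
            cols.append(np.exp(2j * np.pi * f0 * h * t))  # h-th harmonic of f0
            idx.append(k)
            k += 1
        groups.append(idx)
    return np.column_stack(cols), groups
```

Note the sub-octave ambiguity is visible here: the dictionary group for f0/2 contains every harmonic of f0 among its even harmonics, which is why extra structure such as a smooth-envelope regularizer is needed to disambiguate.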
The last three papers, D-F, are less application-specific than the first three, attempting to develop the methodology of sparse regression more independently of the application. Specifically, these papers look at model order selection in group-sparse regression, which is implicitly controlled by choosing a hyperparameter that prioritizes between the regularizer and the fitting term in the optimization problem. In Papers D and E, we examine a metric from array processing, termed the covariance fitting criterion, which is seemingly hyperparameter-free and has been shown to yield sparse estimates for underdetermined linear systems. In these papers, we propose a generalization of the covariance fitting criterion for group-sparsity, and show how it relates to the group-sparse regression problem. In Paper F, we derive a novel method for hyperparameter selection in sparse and group-sparse regression problems. By analyzing how the noise propagates into the parameter estimates, and the corresponding decision rules for sparsity, we propose selecting the hyperparameter as a quantile from the distribution of the maximum noise component, which we sample using the Monte Carlo method.
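The quantile-based selection idea can be sketched generically: simulate noise-only data, record the largest correlation between the dictionary and the noise, and set the regularization level to a high quantile of that maximum, so that pure noise is rejected with the chosen probability. This is a simplified illustration of the general principle under Gaussian-noise assumptions, with illustrative names, not the exact procedure of Paper F.

```python
import numpy as np

def lambda_from_noise_quantile(A, sigma, q=0.95, n_mc=1000, rng=None):
    """Monte Carlo hyperparameter selection sketch: draw noise-only
    realizations, record max |A^T e| for each, and return the q-quantile.
    A lambda at or above this level suppresses noise-only components
    with probability approximately q."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    maxima = np.empty(n_mc)
    for i in range(n_mc):
        e = sigma * rng.standard_normal(n)       # noise-only observation
        maxima[i] = np.max(np.abs(A.T @ e))      # largest noise correlation
    return np.quantile(maxima, q)
```

For group-sparse variants, the inner maximum would instead be taken over group-wise norms of A^T e, matching the group-wise decision rule for sparsity.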
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/f22e014c54994437958db08a44986ae1
 author
 Kronvall, Ted ^{LU}
 supervisor

 Andreas Jakobsson ^{LU}
 opponent

 Professor Heusdens, Richard, TU Delft, The Netherlands
 organization
 publishing date
 2017-09-22
 type
 Thesis
 publication status
 published
 subject
 keywords
 sparse regression, group-sparsity, statistical modeling, regularization, hyperparameter-selection, spectral analysis, audio signal processing, classification, localization, multi-pitch estimation, chroma, convex optimization, ADMM, cyclic coordinate descent, proximal gradient
 pages
 309 pages
 publisher
 Mathematical Statistics, Centre for Mathematical Sciences, Lund University
 defense location
 Lecture hall MH:Riesz, Matematikcentrum, Sölvegatan 18, Lund University, Faculty of Engineering.
 defense date
 2017-10-20 13:15:00
 ISBN
 9789177534174
 9789177534181
 language
 English
 LU publication?
 yes
 id
 f22e014c54994437958db08a44986ae1
 date added to LUP
 2017-09-20 17:23:06
 date last changed
 2018-11-21 21:34:43
@phdthesis{f22e014c54994437958db08a44986ae1,
  author    = {Kronvall, Ted},
  title     = {Group-Sparse Regression: With Applications in Spectral Analysis and Audio Signal Processing},
  school    = {Lund University},
  publisher = {Mathematical Statistics, Centre for Mathematical Sciences, Lund University},
  isbn      = {9789177534174},
  language  = {eng},
  month     = {09},
  year      = {2017},
  url       = {https://lup.lub.lu.se/search/ws/files/31461074/Kronvall17_print.pdf},
}