The choice of hyperparameter(s) notably affects support recovery in LASSO-like sparse regression problems, acting as an implicit model order selection. Hyperparameters are typically selected using cross-validation or various ad hoc approaches, which often overestimate the resulting model order, as they aim to minimize the prediction error rather than to maximize support recovery. In this work, we propose a probabilistic approach to selecting hyperparameters so as to maximize support recovery, quantifying the type I error (false positive rate) using extreme value analysis, such that the regularization level is selected as an appropriate quantile. By instead solving the scaled LASSO problem, the proposed choice of hyperparameter becomes almost independent of the noise variance. Simulation examples illustrate how the proposed method outperforms both cross-validation and the Bayesian Information Criterion in terms of computational complexity and support recovery.
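The quantile idea can be sketched as follows. Under the pure-noise null model, a spurious variable enters the LASSO support only when the regularization level falls below the maximal absolute correlation between the design columns and the noise; choosing the regularization level as the (1 - alpha) quantile of that maximum controls the false positive rate at level alpha. The snippet below is a minimal Monte Carlo sketch of this principle, not the paper's extreme-value-based procedure; the function name `lambda_quantile` and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def lambda_quantile(X, alpha=0.05, sigma=1.0, n_mc=2000, seed=None):
    """Estimate the (1 - alpha) quantile of max_j |x_j' eps| / n
    under the null model y = eps, eps ~ N(0, sigma^2 I), by Monte Carlo.
    (Illustrative sketch; the paper instead derives this quantile
    analytically via extreme value analysis, and the scaled LASSO
    removes the dependence on sigma.)"""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    stats = np.empty(n_mc)
    for k in range(n_mc):
        eps = sigma * rng.standard_normal(n)
        # Largest absolute correlation of any column with pure noise:
        # the smallest lambda that still excludes all false positives.
        stats[k] = np.max(np.abs(X.T @ eps)) / n
    return np.quantile(stats, 1.0 - alpha)
```

A smaller alpha (stricter type I error control) yields a larger regularization level, i.e., a sparser estimated support.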