Chretien, Stephane and Gibberd, Alex and Roy, Sandipan (2018) Hedging parameter selection for basis pursuit. arXiv.
Abstract
In Compressed Sensing and high dimensional estimation, signal recovery often relies on sparsity assumptions and estimation is performed via ℓ1-penalized least-squares optimization, a.k.a. LASSO. The ℓ1 penalization is usually controlled by a weight, also called "relaxation parameter", denoted by λ. It is commonly thought that the practical efficiency of the LASSO for prediction crucially relies on accurate selection of λ. In this short note, we propose to consider the hyper-parameter selection problem from a new perspective which combines the Hedge online learning method of Freund and Schapire with the stochastic Frank-Wolfe method for the LASSO. Using the Hedge algorithm, we show that our simple selection rule can achieve prediction results comparable to cross-validation at a potentially much lower computational cost.
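The abstract describes pairing the Hedge (exponential-weights) rule of Freund and Schapire with Frank-Wolfe iterations for the LASSO in order to select the regularization level. The Python sketch below illustrates one plausible reading of that combination, not the authors' construction: each "expert" is a LASSO fit at one candidate ℓ1 radius, solved here with a plain deterministic Frank-Wolfe loop standing in for the paper's stochastic variant, and Hedge reweights the experts from streaming squared prediction errors. The grid of radii, the learning rate eta, and the holdout split are illustrative assumptions.

```python
# Hedged sketch (assumptions, not the paper's exact algorithm): Hedge
# aggregation over a grid of l1-constraint radii, each expert fitted by
# Frank-Wolfe on the constrained LASSO
#   min_beta 0.5 * ||y - X beta||^2   s.t.   ||beta||_1 <= t.
import numpy as np


def frank_wolfe_lasso(X, y, t, n_steps=200):
    """Frank-Wolfe for 0.5*||y - X b||^2 subject to ||b||_1 <= t."""
    n, p = X.shape
    beta = np.zeros(p)
    for k in range(n_steps):
        grad = X.T @ (X @ beta - y)        # gradient of the quadratic loss
        j = np.argmax(np.abs(grad))        # linear oracle over the l1 ball
        s = np.zeros(p)
        s[j] = -t * np.sign(grad[j])
        gamma = 2.0 / (k + 2.0)            # standard FW step size
        beta = (1 - gamma) * beta + gamma * s
    return beta


def hedge_select(X, y, radii, eta=0.5, holdout_frac=0.3, seed=0):
    """Hedge (exponential weights) over candidate l1 radii.

    Experts are fitted on a training split; their weights are updated
    point by point from squared prediction errors on a held-out stream.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(X.shape[0])
    n_hold = int(holdout_frac * X.shape[0])
    hold, train = idx[:n_hold], idx[n_hold:]

    betas = [frank_wolfe_lasso(X[train], y[train], t) for t in radii]
    weights = np.ones(len(radii))

    for i in hold:                          # stream of held-out points
        losses = np.array([(y[i] - X[i] @ b) ** 2 for b in betas])
        # Hedge multiplicative update; shifting by the minimum loss only
        # rescales by a common factor and keeps the exponentials stable.
        weights *= np.exp(-eta * (losses - losses.min()))
        weights /= weights.sum()

    return radii[int(np.argmax(weights))], weights


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, p, s = 200, 50, 5
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:s] = 3.0
    y = X @ beta_true + 0.5 * rng.standard_normal(n)

    radii = np.array([1.0, 5.0, 10.0, 15.0, 20.0, 40.0])
    best_t, w = hedge_select(X, y, radii)
    print("Hedge weights:", np.round(w, 3))
    print("Selected l1 radius:", best_t)
```

In this reading, the cost of selection is a single pass over a held-out stream rather than refitting the model for every fold and every candidate, which is where the abstract's claimed savings over cross-validation would come from.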