Iteratively Reweighted ℓ1 Approaches to Sparse Composite Regularization
Published 2015 in IEEE Trans. Computational Imaging

ABSTRACT
Motivated by the observation that a given signal $\boldsymbol{x}$ admits sparse representations in multiple dictionaries $\boldsymbol{\Psi}_d$ but with varying levels of sparsity across dictionaries, we propose two new algorithms for the reconstruction of (approximately) sparse signals from noisy linear measurements. Our first algorithm, Co-L1, extends the well-known lasso algorithm from the L1 regularizer $\|\boldsymbol{\Psi x}\|_1$ to composite regularizers of the form $\sum_d \lambda_d \|\boldsymbol{\Psi}_d \boldsymbol{x}\|_1$ while self-adjusting the regularization weights $\lambda_d$. Our second algorithm, Co-IRW-L1, extends the well-known iteratively reweighted L1 algorithm to the same family of composite regularizers. We provide several interpretations of both algorithms: i) majorization-minimization (MM) applied to a non-convex log-sum-type penalty, ii) MM applied to an approximate $\ell_0$-type penalty, iii) MM applied to Bayesian MAP inference under a particular hierarchical prior, and iv) variational expectation-maximization (VEM) under a particular prior with deterministic unknown parameters. A detailed numerical study suggests that our proposed algorithms yield significantly improved recovery SNR when compared to their non-composite L1 and IRW-L1 counterparts.
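The composite regularizer and self-adjusting weights described in the abstract can be illustrated as an outer reweighting loop wrapped around a convex inner problem. The sketch below is a minimal illustration only, assuming a log-sum-style weight update of the form $\lambda_d \propto N_d / (\|\boldsymbol{\Psi}_d \boldsymbol{x}\|_1 + \epsilon)$ and using CVXPY to solve the inner weighted problem; the function name `co_l1`, its parameters, and the exact update rule are illustrative assumptions rather than the algorithm as specified in the paper.

```python
# Minimal sketch of composite iteratively reweighted L1 (in the spirit of Co-L1).
# Assumptions: the weight update below is a plausible MM/log-sum-style rule, not
# necessarily the paper's exact rule; CVXPY is used for the inner subproblem.
import numpy as np
import cvxpy as cp

def co_l1(y, A, Psis, n_outer=10, eps=1e-3, gamma=1e-2):
    """Estimate x from y ~= A @ x with penalty gamma * sum_d lambda_d * ||Psi_d @ x||_1."""
    n = A.shape[1]
    lambdas = np.ones(len(Psis))          # start with uniform regularization weights
    x_hat = np.zeros(n)
    for _ in range(n_outer):
        x = cp.Variable(n)
        penalty = sum(lam * cp.norm1(Psi @ x) for lam, Psi in zip(lambdas, Psis))
        objective = 0.5 * cp.sum_squares(A @ x - y) + gamma * penalty
        cp.Problem(cp.Minimize(objective)).solve()
        x_hat = x.value
        # Self-adjust the weights: dictionaries in which Psi_d @ x_hat is sparser
        # (smaller L1 norm) receive larger weights on the next pass (hypothetical rule).
        lambdas = np.array([Psi.shape[0] / (np.linalg.norm(Psi @ x_hat, 1) + eps)
                            for Psi in Psis])
    return x_hat, lambdas
```

As a usage example, with `A` a random sensing matrix and `Psis` containing, say, an identity matrix and a first-difference matrix, the loop shifts regularization weight toward whichever analysis dictionary yields the sparser representation of the current estimate, which is the self-adjustment behavior the abstract describes.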
PUBLICATION RECORD
- Publication year: 2015
- Publication date: 2015-04-20
- Venue: IEEE Trans. Computational Imaging
- Fields of study: Mathematics, Computer Science
- Source metadata: Semantic Scholar