Proximity Operators of Discrete Information Divergences
Mireille El Gheche, Giovanni Chierchia, J. Pesquet
Published 2016 in IEEE Transactions on Information Theory

ABSTRACT
While φ-divergences have been extensively studied in convex analysis, their use in optimization problems often remains challenging. One of the main shortcomings of existing methods is that the minimization of φ-divergences is usually performed with respect to only one of their arguments, possibly within alternating optimization schemes. In this paper, we overcome this limitation by deriving new closed-form expressions for the proximity operator of such two-variable functions. This makes it possible to employ standard proximal methods to efficiently solve a wide range of convex optimization problems involving φ-divergences. In addition, we show that these proximity operators are useful for computing the epigraphical projection of several functions. The proposed proximal tools are numerically validated in the context of optimal query execution within database management systems, where the problem of selectivity estimation plays a central role. Experiments are carried out on small- to large-scale scenarios.
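As one illustration of the kind of closed form the abstract refers to (this is a standard textbook case, not the paper's two-variable result): the proximity operator of the negative entropy f(x) = x log x, the building block of the Kullback-Leibler divergence in a single variable, can be written in closed form via the Lambert W function. A minimal pure-Python sketch, with W implemented by Newton iteration since the standard library has no `lambertw`:

```python
import math

def lambert_w(z, tol=1e-12):
    # Principal branch of the Lambert W function (w * e^w = z) for z >= 0,
    # computed by Newton iteration from the initial guess log(1 + z).
    w = math.log1p(z)
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - z) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

def prox_negentropy(y, gamma=1.0):
    # prox of f(x) = x * log(x), i.e. the minimizer over x > 0 of
    #     x * log(x) + (x - y)^2 / (2 * gamma).
    # Setting the derivative to zero: log(x) + 1 + (x - y) / gamma = 0,
    # which solves in closed form as x = gamma * W(exp(y/gamma - 1) / gamma).
    return gamma * lambert_w(math.exp(y / gamma - 1.0) / gamma)
```

The paper's contribution goes further, giving joint closed forms in both arguments of a φ-divergence; the sketch above only covers the classical one-variable case, so the function names and scope here are illustrative assumptions.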
PUBLICATION RECORD
- Publication year
2016
- Venue
IEEE Transactions on Information Theory
- Publication date
2016-06-30
- Fields of study
Mathematics, Computer Science
- Source metadata
Semantic Scholar
REFERENCES
- 99 references
CITED BY
- 19 citing papers