On the Estimation of Tsallis Entropy and a Novel Information Measure Based on Its Properties
Aniol Martí, Ferran de Cabrera, J. Riba
Published 2023 in IEEE Signal Processing Letters
ABSTRACT
This letter explores a plug-in estimator of second-order Tsallis entropy based on Kernel Density Estimation (KDE) and its implicit regularization effect. First, it is shown that the expected value of the estimator corresponds to the entropy of an Additive White Gaussian Noise (AWGN) model. Then, we prove several relevant properties of the Tsallis entropy: it is monotonically non-decreasing under the addition of random variables, its derivative with respect to the Gaussian noise power is monotonically non-increasing, and it is concave in the additive noise power. From these properties, we derive an information measure that provides an alternative to the strategy of regularization.
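The estimator discussed in the abstract can be sketched as follows. This is not the authors' code, but a minimal illustration under standard assumptions: for the second-order Tsallis entropy $H_2(X) = 1 - \int f^2(x)\,dx$, a Gaussian-kernel KDE plug-in admits a closed form, since the integral of the squared density estimate reduces to an average of Gaussian kernels (with doubled variance, by the Gaussian convolution identity) evaluated at pairwise sample differences. The bandwidth `h` plays the role of the AWGN standard deviation mentioned in the abstract.

```python
import numpy as np

def tsallis2_kde(x, h):
    """Plug-in estimate of the second-order Tsallis entropy
    H_2 = 1 - integral of f(x)^2, using a Gaussian KDE with bandwidth h.

    For a Gaussian kernel, the integral of f_hat^2 has a closed form:
    the average over all sample pairs (i, j) of a zero-mean Gaussian
    density with variance 2*h^2 evaluated at x_i - x_j.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x[:, None] - x[None, :]        # all pairwise differences x_i - x_j
    var = 2.0 * h ** 2                 # convolution of two Gaussian kernels
    info_potential = np.exp(-d ** 2 / (2.0 * var)).sum() / (n ** 2 * np.sqrt(2.0 * np.pi * var))
    return 1.0 - info_potential
```

Note that the expectation of `info_potential` equals the second-order information potential of the smoothed density, consistent with the AWGN interpretation: the estimator behaves, on average, like the Tsallis entropy of the data corrupted by Gaussian noise of power $h^2$.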
PUBLICATION RECORD
- Publication year: 2023
- Venue: IEEE Signal Processing Letters
- Fields of study: Mathematics, Computer Science
- Source metadata: Semantic Scholar