Fisher Information and Uncertainty Principle for Skew-Gaussian Random Variables

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)

Abstract

Fisher information is a measure used to quantify information and to estimate system-defining parameters. Its scaling and uncertainty properties, linked with Shannon entropy, are useful for characterizing signals through the Fisher-Shannon plane. Moreover, several non-Gaussian distributions have been proposed because assuming Gaussianity in evolving systems is often unrealistic, and distributions that accommodate asymmetry and heavy tails are more suitable. This has motivated the study, in this paper, of Fisher information and the uncertainty principle for skew-Gaussian random variables. We describe the effect of the skew-Gaussian distribution on the uncertainty principle, from which the Fisher information, the Shannon entropy power, and the Fisher divergence are derived. The results indicate that the flexibility of the skew-Gaussian distribution, controlled by a shape parameter, allows explicit expressions of these measures to be derived and a new Fisher-Shannon information plane to be defined. The performance of the proposed methodology is illustrated with numerical results and an application to condition factor time series.
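As a rough illustration of the quantities named in the abstract (not the paper's closed-form derivation), the following Python sketch evaluates the Fisher information and the Shannon entropy power of a standard skew-Gaussian (skew-normal) variable by numerical quadrature and checks the Stam-type uncertainty bound N(X)·I(X) ≥ 1 as the shape parameter varies. The function names and the chosen shape-parameter values are illustrative assumptions, not taken from the article.

```python
# Minimal numerical sketch: Fisher information I(X), Shannon entropy power N(X),
# and the uncertainty product N(X)*I(X) for the standard skew-normal density
# f(x; alpha) = 2 * phi(x) * Phi(alpha * x).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def skew_normal_pdf(x, alpha):
    """Standard skew-normal density 2 * phi(x) * Phi(alpha * x)."""
    return 2.0 * norm.pdf(x) * norm.cdf(alpha * x)

def fisher_information(alpha):
    """Location Fisher information: integral of (f'(x))^2 / f(x)."""
    def integrand(x):
        f = skew_normal_pdf(x, alpha)
        # analytic derivative of f via the product rule
        df = 2.0 * (-x * norm.pdf(x) * norm.cdf(alpha * x)
                    + alpha * norm.pdf(x) * norm.pdf(alpha * x))
        return df**2 / f if f > 1e-300 else 0.0
    val, _ = quad(integrand, -12, 12, limit=200)
    return val

def entropy_power(alpha):
    """Shannon entropy power N = exp(2H) / (2*pi*e), H the differential entropy."""
    def integrand(x):
        f = skew_normal_pdf(x, alpha)
        return -f * np.log(f) if f > 1e-300 else 0.0
    H, _ = quad(integrand, -12, 12, limit=200)
    return np.exp(2.0 * H) / (2.0 * np.pi * np.e)

for alpha in [0.0, 1.0, 3.0, 10.0]:
    I = fisher_information(alpha)
    N = entropy_power(alpha)
    # N*I equals 1 at alpha = 0 (Gaussian case) and exceeds 1 otherwise
    print(f"alpha={alpha:5.1f}  I={I:.4f}  N={N:.4f}  N*I={N*I:.4f}")
```

The printed pairs (N, I) are the coordinates one would place on a Fisher-Shannon information plane for each value of the shape parameter.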

Original language: English
Article number: 2150039
Journal: Fluctuation and Noise Letters
Volume: 20
Issue: 5
DOI
Status: Published - 1 Oct 2021
Published externally: Yes
