Fisher Information and Uncertainty Principle for Skew-Gaussian Random Variables

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

Fisher information is a measure used to quantify information and to estimate the parameters that define a system. Its scaling and uncertainty properties, together with Shannon entropy, are useful for characterizing signals through the Fisher-Shannon plane. Moreover, several non-Gaussian distributions have been proposed, since assuming Gaussianity in evolving systems is often unrealistic and distributions that account for asymmetry and heavy tails are more suitable. This motivates the study of Fisher information and the uncertainty principle for skew-Gaussian random variables in this paper. We describe the effect of the skew-Gaussian distribution on the uncertainty principle, from which the Fisher information, the Shannon entropy power, and the Fisher divergence are derived. The results indicate that the flexibility of the skew-Gaussian distribution, through its shape parameter, allows explicit expressions for these measures to be derived and a new Fisher-Shannon information plane to be defined. The performance of the proposed methodology is illustrated with numerical results and an application to condition factor time series.
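
To make the quantities named in the abstract concrete, the sketch below numerically places a skew-Gaussian density on the Fisher-Shannon plane using SciPy's skewnorm: it approximates the Shannon entropy power N_X and the nonparametric (translation) Fisher information I_X by quadrature and checks Stam's uncertainty bound N_X · I_X ≥ 1. This is only an illustration with an arbitrarily chosen shape parameter; it does not reproduce the explicit closed-form expressions derived in the paper.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import skewnorm

# Shape (skewness) parameter, chosen arbitrarily for illustration.
alpha = 4.0
dist = skewnorm(alpha)  # standard skew-Gaussian: location 0, scale 1

# Shannon entropy power N_X = exp(2 H_X) / (2 pi e), with H_X in nats.
H = dist.entropy()  # differential entropy (numerical integration inside SciPy)
N = np.exp(2.0 * H) / (2.0 * np.pi * np.e)

# Nonparametric Fisher information I_X = E[(d/dx log f(X))^2],
# approximated on a fine grid covering essentially all of the probability mass.
x = np.linspace(dist.ppf(1e-6), dist.ppf(1.0 - 1e-6), 20001)
f = dist.pdf(x)
score = np.gradient(np.log(f), x)  # d/dx log f(x)
I = trapezoid(score**2 * f, x)

# Stam's uncertainty principle: N_X * I_X >= 1, with equality iff Gaussian.
print(f"entropy power N = {N:.4f}, Fisher information I = {I:.4f}, N*I = {N * I:.4f}")
```

For the Gaussian case (shape parameter equal to zero) the product equals one; any non-zero skewness moves the point strictly above that bound, which is what the Fisher-Shannon information plane visualizes.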

Original language: English
Article number: 2150039
Journal: Fluctuation and Noise Letters
Volume: 20
Issue number: 5
DOIs
State: Published - 1 Oct 2021
Externally published: Yes

Keywords

  • condition factor index
  • Fisher information
  • Fisher-Shannon plane
  • Shannon entropy
  • skew-Gaussian distribution
  • uncertainty principle
