TY - JOUR
T1 - Jensen–Fisher information and Jensen–Shannon entropy measures based on complementary discrete distributions with an application to Conway's game of life
AU - Kharazmi, Omid
AU - Contreras-Reyes, Javier E.
AU - Balakrishnan, Narayanaswamy
N1 - Publisher Copyright:
© 2023 Elsevier B.V.
PY - 2023/11
Y1 - 2023/11
N2 - Several information and divergence measures existing in the literature assist in measuring the knowledge contained in sources of information. Studying an information source from both positive and negative aspects results in more accurate and comprehensive information. In many cases, extracting information through the positive approach may not be an easy task, while it may be feasible when dealing with the negative aspect. Negation is a new perspective and direction for quantifying the information or knowledge in a given system from the negative approach. In this work, we study some new information measures, such as Fisher information, Fisher information distance, Jensen–Fisher information and Jensen–Shannon entropy measures, based on complementary distributions. We then show that the proposed Jensen–Fisher information measure can be expressed in terms of the Fisher information distance measure. We further show that the Jensen–Shannon entropy measure has two representations in terms of Kullback–Leibler divergence and Jensen–extropy measures. Some illustrations related to the complementary distributions of Bernoulli and Poisson random variables are then presented. Finally, for illustrative purposes, we examine a real example based on Conway's game of life and present some numerical results in terms of the proposed information measures.
AB - Several information and divergence measures existing in the literature assist in measuring the knowledge contained in sources of information. Studying an information source from both positive and negative aspects results in more accurate and comprehensive information. In many cases, extracting information through the positive approach may not be an easy task, while it may be feasible when dealing with the negative aspect. Negation is a new perspective and direction for quantifying the information or knowledge in a given system from the negative approach. In this work, we study some new information measures, such as Fisher information, Fisher information distance, Jensen–Fisher information and Jensen–Shannon entropy measures, based on complementary distributions. We then show that the proposed Jensen–Fisher information measure can be expressed in terms of the Fisher information distance measure. We further show that the Jensen–Shannon entropy measure has two representations in terms of Kullback–Leibler divergence and Jensen–extropy measures. Some illustrations related to the complementary distributions of Bernoulli and Poisson random variables are then presented. Finally, for illustrative purposes, we examine a real example based on Conway's game of life and present some numerical results in terms of the proposed information measures.
KW - Complementary distribution
KW - Conway's game of life
KW - Extropy measure
KW - Fisher information
KW - Jensen–Fisher information
KW - Jensen–Shannon entropy
UR - http://www.scopus.com/inward/record.url?scp=85162939048&partnerID=8YFLogxK
U2 - 10.1016/j.physd.2023.133822
DO - 10.1016/j.physd.2023.133822
M3 - Article
AN - SCOPUS:85162939048
SN - 0167-2789
VL - 453
JO - Physica D: Nonlinear Phenomena
JF - Physica D: Nonlinear Phenomena
M1 - 133822
ER -