PlumX Metrics

Jensen–Fisher information and Jensen–Shannon entropy measures based on complementary discrete distributions with an application to Conway’s game of life

Physica D: Nonlinear Phenomena, ISSN: 0167-2789, Vol: 453, Page: 133822
2023
  • Citations: 13
  • Usage: 0
  • Captures: 1
  • Mentions: 0
  • Social Media: 0


Article Description

Several information and divergence measures in the literature help quantify the knowledge contained in sources of information. Studying an information source from both positive and negative aspects yields more accurate and comprehensive information. In many cases, extracting information through the positive approach may not be easy, while it may be feasible through the negative aspect. Negation is a new perspective and direction for quantifying the information or knowledge in a given system from the negative approach. In this work, we study some new information measures, such as Fisher information, Fisher information distance, Jensen–Fisher information and Jensen–Shannon entropy measures, based on complementary distributions. We then show that the proposed Jensen–Fisher information measure can be expressed in terms of the Fisher information distance measure. We further show that the Jensen–Shannon entropy measure has two representations, in terms of the Kullback–Leibler divergence and Jensen–extropy measures. Some illustrations relating to the complementary distributions of Bernoulli and Poisson random variables are then presented. Finally, for illustrative purposes, we examine a real example based on Conway's game of life and present numerical results in terms of the proposed information measures.
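The abstract's core ingredients can be sketched in a few lines. The snippet below is a minimal illustration, assuming the common Yager-style negation for the complementary distribution (each probability p_i is mapped to (1 − p_i)/(n − 1)) and the standard Jensen–Shannon entropy defined from Shannon entropy; the function names are illustrative, not taken from the paper.

```python
import numpy as np

def negation(p):
    """Complementary (negated) distribution, assuming the Yager-style
    negation: p_i -> (1 - p_i) / (n - 1), which again sums to 1."""
    n = len(p)
    return (1.0 - p) / (n - 1)

def shannon_entropy(p):
    """Shannon entropy in nats, skipping zero-probability outcomes."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def jensen_shannon(p, q):
    """Jensen-Shannon entropy measure: H of the mixture minus the
    mean of the individual entropies; symmetric and non-negative."""
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# Bernoulli(0.3) as a two-point distribution and its complementary distribution
p = np.array([0.7, 0.3])
p_bar = negation(p)            # for n = 2 this swaps the two probabilities
js = jensen_shannon(p, p_bar)  # divergence between a distribution and its negation
```

For a Bernoulli distribution (n = 2) the negation simply exchanges the success and failure probabilities, so the Jensen–Shannon measure between a distribution and its complement vanishes exactly when p = 0.5 and is bounded above by ln 2.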
