Abstract
In information theory, mutual information quantifies the reduction in uncertainty of one random variable given knowledge of another. The mutual information matrix method was developed to analyze non-linear interactions in high-dimensional time series; the matrix is built from entropies and pairwise mutual information. In this paper, the mutual information matrix is extended from the Shannon and Rényi entropies to the Tsallis entropy. Further, a global measure is defined based on the eigenvalues of a mutual information matrix built through the Tsallis entropy and is used to quantify the total information shared in a time series. A further result, the monotonicity of the Tsallis-based global measure, is obtained by analysing the Tsallis entropy of the Poisson distribution for different rate parameters. In addition, the behaviour of the proposed global measure is illustrated by numerical simulations of vector autoregressive fractionally integrated moving average and sinusoidal models, and the measure is applied to a multivariate time series from ozone monitoring network stations. Compared with the global measures based on Rényi and Shannon entropies, the proposed measure provides clearer results when quantifying the non-linear dependence between stations.
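The sketch below illustrates the general pipeline the abstract describes: estimate pairwise Tsallis mutual information between the components of a multivariate series, assemble the mutual information matrix, and summarise it through its eigenvalues. It is a minimal illustration, not the authors' procedure; the plug-in histogram estimator, the entropic index q = 1.5, the convention I_q(X;Y) = S_q(X) + S_q(Y) - S_q(X,Y), and the eigenvalue-mean summary are all illustrative assumptions rather than the definitions used in the paper.

```python
# Minimal sketch of a Tsallis-entropy mutual information matrix (MIM) and an
# eigenvalue-based global summary. All estimator and parameter choices below
# are assumptions for illustration, not the paper's definitions.
import numpy as np

def tsallis_entropy(p, q=1.5):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete pmf."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_mutual_information(x, y, q=1.5, bins=16):
    """Plug-in estimate using one common convention: I_q = S_q(X) + S_q(Y) - S_q(X,Y)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return (tsallis_entropy(px, q) + tsallis_entropy(py, q)
            - tsallis_entropy(pxy.ravel(), q))

def mutual_information_matrix(series, q=1.5, bins=16):
    """Symmetric matrix of pairwise Tsallis mutual information estimates."""
    d = series.shape[1]
    M = np.zeros((d, d))
    for i in range(d):
        for j in range(i, d):
            M[i, j] = M[j, i] = tsallis_mutual_information(
                series[:, i], series[:, j], q, bins)
    return M

def global_measure(M):
    """Illustrative scalar summary: mean of the eigenvalues of the MIM."""
    return np.linalg.eigvalsh(M).mean()

# Example on a 3-dimensional simulated series with one non-linearly dependent component.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
data = np.column_stack([x,
                        np.sin(x) + 0.1 * rng.normal(size=n),
                        rng.normal(size=n)])
M = mutual_information_matrix(data)
print("MIM:\n", np.round(M, 3))
print("global measure:", round(global_measure(M), 3))
```

In this toy example the second component is a noisy non-linear function of the first, so the corresponding off-diagonal MIM entry dominates and raises the eigenvalue-based summary relative to an independent series.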
| Original language | English |
|---|---|
| Article number | 111272 |
| Pages (from-to) | 5239-5249 |
| Number of pages | 11 |
| Journal | Nonlinear Dynamics |
| Volume | 113 |
| Issue number | 6 |
| DOIs | |
| State | Published - Mar 2025 |
Keywords
- Eigenvalues
- Mutual information
- Mutual information matrix
- Ozone time series
- Replica trick method
- Tsallis entropy