Improving quality of sample entropy estimation for continuous distribution probability functions
Abstract: Entropy is one of the key parameters characterizing the state of a system in statistical physics. Although entropy is defined for systems described by both discrete and continuous probability distribution functions (PDFs), in numerous applications the sample entropy is estimated from a histogram, which means that the continuous PDF is represented by a set of probabilities. Such a procedure may lead to ambiguities and even misinterpretation of the results. In this paper, two possible general algorithms based on continuous PDF estimation are discussed as applied to the Shannon and Tsallis entropies. It is shown that the proposed algorithms may improve entropy estimation, particularly in the case of small data sets.
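The contrast drawn in the abstract can be sketched numerically: a histogram-based (discrete) estimate of the Shannon differential entropy versus an estimate built on a continuous PDF reconstruction. The sketch below uses a Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth as one possible continuous-PDF estimator; the bin count, bandwidth rule, and function names are illustrative assumptions, not the paper's specific algorithms. A Tsallis-entropy variant is included since the abstract covers both functionals.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=200)  # a small data set, N(0, 1)

def entropy_histogram(x, bins=16):
    """Shannon differential entropy from a histogram (piecewise-constant PDF)."""
    counts, edges = np.histogram(x, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()          # bin probabilities
    nz = p > 0
    # H = -sum_i p_i * ln(p_i / w_i), the entropy of the step-function density
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

def kde_on_grid(x, grid_points=512):
    """Gaussian kernel density estimate evaluated on a regular grid."""
    n = x.size
    h = 1.06 * x.std(ddof=1) * n ** (-1 / 5)   # Silverman's rule of thumb
    grid = np.linspace(x.min() - 4 * h, x.max() + 4 * h, grid_points)
    f = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    f /= n * h * np.sqrt(2 * np.pi)            # normalize the kernel sum
    return grid, f

def entropy_kde(x):
    """Shannon differential entropy H = -∫ f ln f dx via the continuous PDF."""
    grid, f = kde_on_grid(x)
    dg = grid[1] - grid[0]
    return -np.sum(f * np.log(f)) * dg         # Riemann-sum integration

def tsallis_kde(x, q=2.0):
    """Tsallis entropy S_q = (1 - ∫ f^q dx) / (q - 1) from the continuous PDF."""
    grid, f = kde_on_grid(x)
    dg = grid[1] - grid[0]
    return (1.0 - np.sum(f ** q) * dg) / (q - 1.0)

true_H = 0.5 * np.log(2 * np.pi * np.e)  # exact value ~1.4189 for N(0, 1)
print(f"histogram H = {entropy_histogram(samples):.4f}")
print(f"KDE       H = {entropy_kde(samples):.4f}  (exact {true_H:.4f})")
print(f"Tsallis S_2 = {tsallis_kde(samples, q=2.0):.4f}")
```

For small samples the histogram estimate is sensitive to the bin count, while the kernel estimate introduces a smoothing bias controlled by the bandwidth; the paper's point is that the continuous reconstruction tends to behave better in the small-sample regime.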
|Journal series||Physica A: Statistical Mechanics and its Applications, ISSN 0378-4371, e-ISSN 1873-2119, (A 30 pkt)|
|Keywords in English||Entropy, Sample entropy, Data analysis|
|Score|| = 30.0, 12-11-2020, ArticleFromJournal|
|Publication indicators||= 6; = 6; : 2016 = 1.324; : 2016 = 2.243 (2) - 2016=2.146 (5)|
* The presented citation count is obtained through Internet information analysis and is close to the number calculated by the Publish or Perish system.