Entropy measuring algorithms
An (incomplete) list of entropy measuring algorithms for the general (correlated/non-IID) case, drawn from NIST SP 800-90B, Schürmann and Grassberger, Hutter Prize entries, and deep learning techniques such as those of Gupta and Agarwal:
- Most Common Value Estimate (see the sketch after this list).
- Collision Estimate.
- Markov Estimate.
- Compression Estimate.
- t-Tuple Estimate.
- Longest Repeated Substring (LRS) Estimate.
- Multi Most Common in Window Prediction Estimate.
- Lag Prediction Estimate.
- MultiMMC Prediction Estimate.
- LZ78Y Prediction Estimate.
- Ziv-Lempel.
- Gambling & suffix trees.
- Bayesian probability estimation.
- Rissanen’s method.
- Superposition of probabilities.
- Global probability estimates.
- Hutter Prize entries.
- Ouija technique for asking Shannon himself.
- And a shed load of deep learning/artificial intelligence techniques…
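Of these, the Most Common Value Estimate is the simplest to picture: find the proportion of the most frequent symbol, take a 99% upper confidence bound on it, and report the corresponding min-entropy. Below is a minimal Python sketch assuming the formula from NIST SP 800-90B section 6.3.1; the function name and the toy biased source are illustrative choices, not part of the standard.

```python
# Minimal sketch of the NIST SP 800-90B Most Common Value Estimate (section 6.3.1).
# Illustration only; not a validated implementation.
import math
import random
from collections import Counter

def most_common_value_estimate(samples):
    """Min-entropy estimate in bits per sample for a sequence of hashable samples."""
    L = len(samples)
    # Proportion of the most frequently occurring value.
    p_hat = Counter(samples).most_common(1)[0][1] / L
    # 99% upper confidence bound on that proportion (z = 2.576, as per the spec).
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (L - 1)))
    return -math.log2(p_u)

# A heavily biased byte source scores well under 8 bits/byte.
random.seed(1)
biased = [0 if random.random() < 0.6 else random.randrange(256) for _ in range(100_000)]
print(most_common_value_estimate(biased))   # roughly 0.7 bits per sample
```

The full 800-90B non-IID assessment takes the minimum over this and the nine other estimators at the top of the list; the sketch shows only that one term.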
And there are others still, with a significant proportion based around compression theory. So what to do?
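As a taste of that compression-theory flavour, the crudest possible sketch is simply to compress the sample and read the compressed size as a loose upper bound on its entropy content. This illustrates the idea only and is not one of the named estimators above; zlib at maximum level and the bits-per-byte accounting are arbitrary choices.

```python
# Crude compression-ratio gauge: compressibility as a loose upper bound on entropy.
import os
import zlib

def compression_bits_per_byte(data: bytes) -> float:
    """Rough upper bound on entropy, in bits per byte, via zlib compressibility."""
    compressed = zlib.compress(data, 9)
    return 8.0 * len(compressed) / len(data)

print(compression_bits_per_byte(b"abababababab" * 1000))   # far below 8 bits/byte
print(compression_bits_per_byte(os.urandom(16_384)))       # about 8 bits/byte, plus format overhead
```

A serious compression-based estimator has to correct for the compressor's own overheads and limitations, which this sketch does not.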