EntropyHub

An open-source toolkit for entropic time series analysis

Information and uncertainty can be regarded as two sides of the same coin: the more uncertainty there is, the more information we gain by removing it. In information and probability theory, entropy quantifies that uncertainty.
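As a small illustration of this idea (not part of the EntropyHub API), the Shannon entropy of a discrete probability distribution measures its uncertainty in bits; a fair coin is maximally uncertain, while a biased coin is more predictable and therefore carries less information per outcome:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 contribute nothing, by convention.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximum uncertainty for two outcomes -> 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin: outcomes are easier to predict,
# so removing the remaining uncertainty yields less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

Removing uncertainty entirely (a certain outcome, `p = 1`) gives an entropy of zero: there is nothing left to learn.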

The concept of entropy has its origins in classical physics, under the second law of thermodynamics, a law considered to underpin our fundamental understanding of time in physics. Analysing the analog world around us requires measuring time in discrete steps, but doing so compromises our ability to measure entropy accurately. Various measures have therefore been derived to estimate entropy (uncertainty) from discrete time series, each seeking to best capture the uncertainty of the system under examination. This has resulted in a growing family of entropy statistics, beginning with approximate entropy.
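To make the idea of estimating uncertainty from a discrete time series concrete, here is a minimal, self-contained sketch of approximate entropy (ApEn) in the spirit of Pincus's original definition; it is an illustration only, not EntropyHub's implementation. It counts how often patterns of length `m` that are similar (within tolerance `r`) remain similar when extended to length `m + 1` — regular signals stay predictable and score low, irregular ones score high:

```python
import math
import random

def approx_entropy(series, m=2, r=0.2):
    """Approximate entropy of a 1-D series (illustrative sketch).

    m: embedding dimension (template length).
    r: similarity tolerance (absolute units of the series).
    """
    n = len(series)

    def phi(m):
        # All overlapping templates of length m.
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            # Chebyshev distance: templates match if every pair of
            # corresponding points lies within tolerance r.
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

# A perfectly periodic signal is highly predictable -> low ApEn.
periodic = [0.0, 1.0] * 50
# A pseudorandom signal is far less predictable -> higher ApEn.
random.seed(0)
noisy = [random.random() for _ in range(100)]

print(approx_entropy(periodic, m=2, r=0.1))
print(approx_entropy(noisy, m=2, r=0.1))
```

In practice the tolerance `r` is usually scaled to the series (commonly 0.1 to 0.25 times its standard deviation), and later statistics such as sample entropy refine this counting scheme to reduce bias.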
