* Change definition of `entropy(diff_ent_est, ...)`
* move Kraskov to new interface
* port Kozachenko to new interface
* port Zhu to new interface
* move ZhuSingh to new interface
* move Lord to new interface
* move Gao and Goria to new interface
* move all order statistics estimators to new interface
* [WIP] fixing tests
* add internal function for checking with a given entropy definition
* port all estimator tests to new interface
* check if base is the same in compatibility check
* remove unnecessary test
* discuss difference of estimator and definition
* add discrete entropy estimator type
* add it to normalized/maximum as well
* completely remove possibility of `entropy(def, differential_estimator)`
* rename to MLEntropy
* docstring of entropy should use estimator
* add discrete entropy estimator to docs
* remove testing of old interface
* correct Alizadeh error
* resolve all outstanding issues
* fix broken normalized entropy code
* done
* bump version
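For reference, a minimal sketch of the interface these commits port the differential estimators (Kraskov, Kozachenko-Leonenko, Zhu, ZhuSingh, Lord, Gao, Goria, and the order-statistics estimators) to. The constructor keywords (e.g. the neighbor count `k`) and the default logarithm base are assumptions based on the usual conventions of these estimators, not quoted from the released docstrings:

```julia
using ComplexityMeasures

x = randn(10_000)  # sample from a standard normal distribution

# New interface: pass only the differential estimator to `entropy`;
# the estimator itself implies which definition (here Shannon) it estimates.
h_kraskov = entropy(Kraskov(k = 4), x)
h_zhu     = entropy(Zhu(k = 4), x)

# For a standard normal both values should approximate the analytic Shannon
# differential entropy 0.5 * log(2π * ℯ) ≈ 1.42 nats (≈ 2.05 bits),
# depending on the estimator's base keyword.
```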
CHANGELOG.md (+5 lines changed: 5 additions & 0 deletions)
@@ -2,6 +2,11 @@

 Changelog is kept with respect to version 0.11 of Entropies.jl. From version v2.0 onwards, this package has been renamed to ComplexityMeasures.jl.

+## 2.3
+- Like differential entropies, discrete entropies now also have their own estimator type.
+- The approach of giving both an entropy definition and an entropy estimator to `entropy` has been dropped. The entropy estimators now know which definitions they apply to. This change is a deprecation, i.e., backwards compatible.
+- Added the `MLEntropy` discrete entropy estimator.
+
 ## 2.2

 - Corrected documentation for `SymbolicPermutation`, `SymbolicAmplitudeAwarePermutation`,
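To make these changelog entries concrete, here is a hedged before/after sketch. The exact signatures, in particular whether `MLEntropy` wraps an entropy definition and whether `entropy` accepts a discrete estimator together with a `ProbabilitiesEstimator`, are assumptions inferred from the entries above rather than verified against the released API:

```julia
using ComplexityMeasures

x = randn(5_000)

# Deprecated (still works, backwards compatible): definition + differential estimator.
# h = entropy(Shannon(), Kraskov(k = 4), x)

# New: the differential estimator is passed on its own.
h_differential = entropy(Kraskov(k = 4), x)

# Discrete entropies now have their own estimator type. MLEntropy is the plain
# maximum-likelihood ("plug-in") estimator, assumed here to wrap a definition.
y = rand(1:4, 5_000)  # a categorical time series
h_discrete = entropy(MLEntropy(Shannon()), CountOccurrences(), y)
```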
docs/src/entropies.md (+11/-3 lines changed: 11 additions & 3 deletions)
@@ -6,9 +6,10 @@ The entropies API is defined by

 - [`EntropyDefinition`](@ref)
 - [`entropy`](@ref)
+- [`DiscreteEntropyEstimator`](@ref)
 - [`DifferentialEntropyEstimator`](@ref)

-Please be sure you have read the [Terminology](@ref) section before going through the API here, to have a good idea of the different "flavors" of entropies and how they all come together over the common interface of the [`entropy`](@ref) function.
+Please be sure you have read the [Terminology](@ref terminology) section before going through the API here, to have a good idea of the different "flavors" of entropies and how they all come together over the common interface of the [`entropy`](@ref) function.
docs/src/index.md (+11/-10 lines changed: 11 additions & 10 deletions)
@@ -4,7 +4,7 @@
 ComplexityMeasures
 ```

-## Content and terminology
+## [Content and terminology](@id terminology)

 !!! note
     The documentation here follows (loosely) chapter 5 of
@@ -18,9 +18,8 @@ from input data.

 ### Probabilities

-Entropies and other complexity measures are typically computed based on *probability
-distributions* (or more precisely
-[*probability mass functions*](https://en.wikipedia.org/wiki/Probability_mass_function)),
+Entropies and other complexity measures are typically computed based on _probability
+distributions_,
 which we simply refer to as "probabilities".
 Probabilities can be obtained from input data in a plethora of different ways.
 The central API function that returns a probability distribution
@@ -36,14 +35,16 @@ even fundamentally, different quantities.
 In ComplexityMeasures.jl, we provide the generic
 function [`entropy`](@ref) that tries to both clarify disparate entropy concepts, while
 unifying them under a common interface that highlights the modular nature of the word
-"entropy". In summary, there are only two main types of entropy.
+"entropy".

-- *Discrete* entropies are functions of probabilities (specifically, probability mass functions). Computing a discrete entropy boils
+On the highest level, there are two main types of entropy.
+
+- *Discrete* entropies are functions of [probability mass functions](https://en.wikipedia.org/wiki/Probability_mass_function). Computing a discrete entropy boils
 down to two simple steps: first estimating a probability distribution, then plugging
-the estimated probabilities into one of the so-called "generalized entropy" definitions.
+the estimated probabilities into an estimator of a so-called "generalized entropy" definition.
 Internally, this is literally just a few lines of code where we first apply some
 [`ProbabilitiesEstimator`](@ref) to the input data, and feed the resulting
-[`probabilities`](@ref) to [`entropy`](@ref) with some [`EntropyDefinition`](@ref).
+[`probabilities`](@ref) to [`entropy`](@ref) with some [`DiscreteEntropyEstimator`](@ref).
 - *Differential/continuous* entropies are functions of
 [probability density functions](https://en.wikipedia.org/wiki/Probability_density_function),
 which are *integrals*. Computing differential entropies therefore rely on estimating
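The "two simple steps" described in the hunk above can be spelled out in code. A sketch, assuming `SymbolicPermutation` keeps its order and delay keywords `m` and `τ`, and that `entropy` accepts a `DiscreteEntropyEstimator` together with either pre-computed probabilities or a `ProbabilitiesEstimator`:

```julia
using ComplexityMeasures

x = randn(10_000)

# Step 1: estimate a probability mass function from the data.
probest = SymbolicPermutation(m = 3, τ = 1)
p = probabilities(probest, x)

# Step 2: plug the probabilities into a discrete entropy estimator.
h = entropy(MLEntropy(Shannon()), p)

# Equivalently, both steps in one call (assumed signature):
h2 = entropy(MLEntropy(Shannon()), probest, x)
```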
@@ -59,15 +60,15 @@ They are the good old discrete Shannon entropy ([`Shannon`](@ref)), but calculated with
 *new probabilities estimators*.

 Even though the names of these methods (e.g. "wavelet entropy") sound like names for new
-entropies, they are *method* names. What these methods actually do is to devise novel
+entropies, what they actually do is to devise novel
 ways of calculating probabilities from data, and then plug those probabilities into formal
 discrete entropy formulas such as
 the Shannon entropy. These probabilities estimators are of course smartly created so that
 they elegantly highlight important complexity-related aspects of the data.

 Names for methods such as "permutation entropy" are commonplace, so in
 ComplexityMeasures.jl we provide convenience functions like [`entropy_permutation`](@ref).
-However, we emphasise that these functions really aren't anything more than
+However, we emphasize that these functions really aren't anything more than
 2-lines-of-code wrappers that call [`entropy`](@ref) with the appropriate
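The wrapper relationship described in this last hunk can be illustrated with a short sketch. The keyword names for `entropy_permutation` (`m`, `τ`) and the default logarithm bases are assumed from the usual permutation-entropy conventions rather than quoted from the released docstrings:

```julia
using ComplexityMeasures

x = randn(10_000)

# Convenience wrapper...
h1 = entropy_permutation(x; m = 3, τ = 1)

# ...is essentially just a call to `entropy` with the matching
# probabilities estimator (here for the Shannon entropy).
h2 = entropy(Shannon(), SymbolicPermutation(m = 3, τ = 1), x)

h1 ≈ h2  # should hold when the defaults (e.g. the base) match
```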