Comments (8)
I don't have any students yet, but I hope to find some soon. I will keep bugging the Exeter people to see how I can find more students. In the meantime, I'll promote more such projects on my website.
from complexitymeasures.jl.
> I argue we should revert the measure to be a complexity measure instead, and keep the note at the end that this can be generalized, but one needs to come up with or provide appropriate definitions so that I_i makes sense in the equation of the fluctuation complexity.
Generalized or not, `FluctuationComplexity` should remain an information measure, not a complexity measure: it is still just a functional of a PMF, and it can be estimated just like any of the other information measures, using outcome spaces, probabilities estimators, and generic information estimators.
> This equation is only compatible with the Shannon entropy. Only the Shannon entropy defines -log(p) as the "unit" of information, and the Shannon entropy is just the weighted average of that information. With other information measures, the equation of the fluctuation complexity simply doesn't make as much sense.
The equation is compatible with anything. If you use anything other than Shannon entropy, it is the deviation of the Shannon information around some other summary statistic. This is just as valid an information statistic as any other.
If one insists that a fluctuation measure must, on a general basis, compare X-type information to X-type weighted averages, then sure, the generalization does not make sense. But neither the measure description nor the implementation makes any such demand. The docs are also explicit that the default inputs give you the original Shannon-type measure.
I'll think about it a bit and see if there are any obvious ways of generalizing, though, because it is a good point that one should match the "unit of information" to the selected measure in order for the measure to precisely respect the original intention.
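For concreteness, the Shannon-type measure being debated here can be sketched in a few lines. This is a minimal illustration, not the ComplexityMeasures.jl API, and the function name is hypothetical: the fluctuation complexity is the standard deviation of the information content I_i = -log2(p_i) around the Shannon entropy H.

```python
import math

def fluctuation_complexity(p):
    """Standard deviation of the per-outcome Shannon information content
    I_i = -log2(p_i) around the Shannon entropy H = sum_i p_i * I_i.
    Zero for a uniform distribution, where every I_i equals H."""
    info = [-math.log2(p_i) for p_i in p]
    H = sum(p_i * I_i for p_i, I_i in zip(p, info))
    return math.sqrt(sum(p_i * (I_i - H) ** 2 for p_i, I_i in zip(p, info)))

print(fluctuation_complexity([0.25, 0.25, 0.25, 0.25]))  # -> 0.0 (uniform)
print(fluctuation_complexity([0.5, 0.25, 0.25]))         # -> 0.5
```

Swapping -log2 for some other per-outcome information content, and H for the corresponding entropy, is exactly the kind of generalization at issue in this thread.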
from complexitymeasures.jl.
Just quickly did some calculations. We can easily define, e.g., a Tsallis-type "self-information" or "information content" analogous to the Shannon-type information content. The same goes for many of the other entropy types.
Perhaps a good middle ground here is just to explicitly find the "self-information" expressions for each of the entropies, then use dispatch to produce a "correct"/measure-specific deviation, depending on whether one picks `Tsallis`/`Renyi`/`Shannon` or something else?
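As a sketch of what such a Tsallis-type self-information could look like (a standard construction via the q-logarithm; the notation here is mine, not the package's):

```latex
% q-logarithm and the induced Tsallis-type information content
\ln_q x = \frac{x^{1-q} - 1}{1 - q}, \qquad
I_q(p_i) = \ln_q\!\frac{1}{p_i} = \frac{p_i^{\,q-1} - 1}{1 - q}.
% Its weighted average recovers the Tsallis entropy:
S_q = \sum_i p_i\, I_q(p_i) = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
% and I_q(p_i) \to -\ln p_i as q \to 1, recovering the Shannon case.
```

A Tsallis-type fluctuation measure would then be $\sqrt{\sum_i p_i \,(I_q(p_i) - S_q)^2}$, which reduces to the Shannon-type measure at $q = 1$.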
from complexitymeasures.jl.
> Perhaps a good middle ground here is just to explicitly find the "self-information" expressions for each of the entropies, then use dispatch to produce a "correct"/measure-specific deviation, depending on whether one picks `Tsallis`/`Renyi`/`Shannon` or something else?
Yes, but this sounds like a research paper to me. If someone published this Shannon fluctuation measure, someone can publish the generalization.
from complexitymeasures.jl.
> Yes, but this sounds like a research paper to me. If someone published this Shannon fluctuation measure, someone can publish the generalization.
But do we restrict measures implemented here to measures that have already been published? The package already includes a plethora of methods that do not appear in any journal.
from complexitymeasures.jl.
Yeah, we don't, and it probably isn't too complex to extract the unit of information for each measure. I'm just saying that if you do, you might as well publish a short paper on it. Maybe we can get a BSc student to write a small paper about this; it looks like a low-risk, high-reward project for a BSc student.
from complexitymeasures.jl.
> Yeah, we don't, and it probably isn't too complex to extract the unit of information for each measure. I'm just saying that if you do, you might as well publish a short paper on it. Maybe we can get a BSc student to write a small paper about this; it looks like a low-risk, high-reward project for a BSc student.
I totally agree; in fact, I already started a paper draft on Overleaf to keep my notes in one place 😁 Do you have any bachelor students in mind who may be interested? This is something that shouldn't take too much time: a few simple derivations, a few example applications, and the corresponding dispatch in the code here for each generalized variant of the fluctuation complexity.
from complexitymeasures.jl.
Ok, then I'll probably just write up the paper myself as soon as possible. If you want to have a read, give me a nod here, and I'll send you a link to the paper.
from complexitymeasures.jl.
Related Issues (20)
- The function `lt` in `OrdinalPatternEncoding` isn't actually used
- Reproducibility for `OrdinalPatternEncoding`
- It shouldn't be possible to construct an empty `CombinationEncoding`
- Feature: "distribution entropy"
- Feature: bubble entropy (description is WIP)
- Feature: "increment entropy"
- Feature: "attention entropy"
- `missing_probabilities`
- `counts_and_outcomes` for `BubbleSortSwaps` should also accept state space sets
- Syntax with type parameter `{m}` in `OrdinalPatterns` is not harmonious with the rest of the library
- Encoding using `Dispersion` is slower than necessary due to manual integration for normal cdf
- Encoding complex-valued data
- [Q] How to calculate MI between two vectors?
- Latest stable documentation has an error in the `StatisticalComplexity` docstring
- "Amplitude entropy"
- Good-Turing probabilities estimator
- `AddConstant` estimator lacks reference
- Latest tagged release not appearing in either stable or dev docs
- `TsallisExtropy` docstring missing some keyword arguments