
Comments (8)

Datseris commented on July 4, 2024

I don't have any students yet. I hope to find some soon. I will continue bugging the Exeter people to see how I can find more students. In the meantime, I'll promote more such projects on my website.

from complexitymeasures.jl.

kahaaga commented on July 4, 2024

> I argue we should revert the measure to be a complexity measure instead, and keep the note at the end that this can be generalized, but one needs to come up with or provide appropriate definitions so that I_i makes sense in the equation for fluctuation complexity.

Generalized or not, FluctuationComplexity should remain an information measure, not a complexity measure, because it is still just a functional of a PMF, and can be estimated just like any of the other information measures using outcome spaces, probability estimators and generic information estimators.

> This equation is only compatible with the Shannon entropy. Only the Shannon entropy defines -log(p) as the "unit" of information, and the Shannon entropy is just the weighted average of this information. With other information measures, the equation for the fluctuation complexity simply doesn't make as much sense.

The equation is compatible with anything. If you use anything other than Shannon entropy, it is the deviation of the Shannon information around some other summary statistic. This is just as valid an information statistic as any other.

If one insists that a fluctuation measure - on a general basis - must compare X-type information to X-type weighted averages, then sure, the generalization does not make sense. But neither the measure description, nor the implementation, makes any such demand. The docs are also explicit that the default inputs give you the original Shannon-type measure.

I'll think about it a bit and see if there are any obvious ways of generalizing, though, because it is a good point that one should match the "unit of information" to the selected measure in order for the measure to precisely respect the original intention.
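To make the quantity under discussion concrete, here is a minimal sketch (in Python rather than the package's Julia, and not ComplexityMeasures.jl's actual implementation) of the Shannon-type fluctuation complexity: the standard deviation of the self-information I_i = -log(p_i) around its p-weighted mean, which is exactly the Shannon entropy H.

```python
import numpy as np

def fluctuation_complexity(p, base=2):
    """Shannon-type fluctuation complexity: the standard deviation of the
    self-information I_i = -log_base(p_i) around the Shannon entropy
    H = sum_i p_i * I_i. Illustrative sketch, not the package's code."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability outcomes contribute nothing
    I = -np.log(p) / np.log(base)     # self-information of each outcome
    H = np.sum(p * I)                 # Shannon entropy = p-weighted mean of I
    return np.sqrt(np.sum(p * (I - H) ** 2))

# A uniform PMF has constant self-information, so the fluctuation vanishes:
print(fluctuation_complexity([0.25, 0.25, 0.25, 0.25]))  # 0.0
```

Note that the "summary statistic" the deviation is taken around is H by construction here; the point of contention above is what to use for I_i and H when the chosen measure is not Shannon entropy.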


kahaaga commented on July 4, 2024

I just did some quick calculations. We can easily define e.g. a Tsallis-type "self-information" or "information content", analogous to the Shannon-type information content. The same goes for many of the other entropy types.

Perhaps a good middle ground here is just to explicitly find the "self-information" expressions for each of the entropies, then use dispatch to produce a "correct"/measure-specific deviation, depending on whether one picks Tsallis/Rényi/Shannon or something else?
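One plausible candidate for the Tsallis case (an assumption on my part, not a settled design for the package) follows from writing the Tsallis entropy as S_q = Σ_i p_i ln_q(1/p_i), where ln_q(x) = (x^(1-q) - 1)/(1 - q) is the q-logarithm: that identity suggests I_q(p_i) = ln_q(1/p_i) as a Tsallis-type self-information whose p-weighted mean recovers S_q. A Python sketch, with a hypothetical `selfinfo` argument standing in for the dispatch mechanism:

```python
import numpy as np

def shannon_selfinfo(p):
    """Shannon self-information, in nats."""
    return -np.log(p)

def tsallis_selfinfo(p, q=2.0):
    """Candidate Tsallis-type self-information: the q-logarithm of 1/p,
    ln_q(1/p) = (p**(q-1) - 1) / (1 - q), chosen so that
    sum_i p_i * I_q(p_i) recovers the Tsallis entropy S_q."""
    if q == 1.0:
        return -np.log(p)             # Shannon limit as q -> 1
    return (p ** (q - 1) - 1) / (1 - q)

def fluctuation(p, selfinfo=shannon_selfinfo):
    """Deviation of the chosen self-information around its p-weighted mean
    (the matching entropy, by construction). The `selfinfo` argument mimics
    measure-specific dispatch."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    I = selfinfo(p)
    H = np.sum(p * I)
    return np.sqrt(np.sum(p * (I - H) ** 2))

# With q = 2, I_2(p) = 1 - p; for p = [0.5, 0.25, 0.25] the deviation is 0.125:
print(fluctuation([0.5, 0.25, 0.25], tsallis_selfinfo))  # 0.125
```

The same pattern would extend to Rényi-type measures if a suitable self-information expression exists; that is precisely the derivation work discussed below.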


Datseris commented on July 4, 2024

> Perhaps a good middle ground here is just to explicitly find the "self-information" expressions for each of the entropies, then use dispatch to produce a "correct"/measure-specific deviation, depending on whether one picks Tsallis/Rényi/Shannon or something else?

Yes, but this sounds like a research paper to me. If someone published this Shannon fluctuation information, someone else can publish the generalization.


kahaaga commented on July 4, 2024

> Yes, but this sounds like a research paper to me. If someone published this Shannon fluctuation information, someone else can publish the generalization.

But do we restrict the measures implemented here to measures that have already been published? The package already includes a plethora of methods that do not appear in any journal.


Datseris commented on July 4, 2024

Yeah, we don't, and it probably isn't too complex to extract the unit of information for each measure. I'm just saying that if you do, you might as well publish a short paper on it. Maybe we can get a BSc student to write a small paper about this; it looks like a low-risk, high-reward project for a BSc student.


kahaaga commented on July 4, 2024

> Yeah, we don't, and it probably isn't too complex to extract the unit of information for each measure. I'm just saying that if you do, you might as well publish a short paper on it. Maybe we can get a BSc student to write a small paper about this; it looks like a low-risk, high-reward project for a BSc student.

I totally agree; in fact, I already started a paper draft on Overleaf to keep my notes in one place 😁 Do you have any bachelor students in mind who may be interested? This shouldn't take too much time: a few simple derivations, a few example applications, and corresponding dispatch in the code here for each generalized variant of the fluctuation complexity.


kahaaga commented on July 4, 2024

Ok, then I'll probably just write up the paper myself as soon as possible. If you want to have a read, give me a nod here and I'll send you a link to the paper.

