Add calculation of total information content and information temperature #5011
Hey! I would like to try and work on this.
Go ahead.
I am confused about which directory the "information_theory.py" file should go in, and which test directory should hold the "test_information_theory.py" file. Please advise.
Hello!
I plan to:
Does this approach align with Rizin’s design? Any suggestions before I start?
Nah, a better approach is to:
This is roughly how you can calculate the entropy on data.
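For concreteness, here is a minimal sketch of byte-level Shannon entropy, assuming the usual definition H = -Σ p·log₂(p) over byte frequencies. This is only an illustration, not Rizin code, and the function name is made up:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte sequence, in bits per byte (0.0 to 8.0)."""
    n = len(data)
    if n == 0:
        return 0.0
    # -sum(p * log2(p)) rewritten as sum(p * log2(1/p)) to avoid -0.0
    return sum((c / n) * math.log2(n / c) for c in Counter(data).values())

print(shannon_entropy(b"\x00" * 64))       # 0.0: constant data
print(shannon_entropy(bytes(range(256))))  # 8.0: all byte values equally likely
```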
Extend information properties calculation and visualisation with two more fundamental functions:
Definitions
Total information content, or mutual information, can be considered analogous to enthalpy in thermodynamics. Mutual information is defined as:
$$I(X;Y) = \sum_{x \in X} \sum_{y \in Y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}$$
Note that mutual information can only be calculated relative to something else, so it is not always applicable. What can be done is to compute mutual information against a reference, e.g. against a uniformly random sequence, or between two data blocks compared with each other.
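As an illustration of the formula above, here is a sketch of a plug-in estimate of I(X;Y) between two byte blocks, using the empirical joint histogram of aligned byte pairs. The function name is my own, and note that this estimator is biased upward on short blocks (so a comparison against a random sequence needs a lot of data to come out near zero):

```python
import math
from collections import Counter

def mutual_information(x: bytes, y: bytes) -> float:
    """Plug-in estimate of I(X;Y) in bits, from the empirical joint
    distribution of aligned byte pairs in two blocks."""
    n = min(len(x), len(y))
    if n == 0:
        return 0.0
    joint = Counter(zip(x[:n], y[:n]))   # p(x, y) counts
    px, py = Counter(x[:n]), Counter(y[:n])  # marginal counts
    return sum(
        (c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
        for (a, b), c in joint.items()
    )

data = bytes(range(256)) * 16
print(mutual_information(data, data))                 # 8.0: I(X;X) = H(X)
print(mutual_information(data, b"\x00" * len(data)))  # 0.0: constant block carries no information
```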
In information theory, the information temperature ($T_{\text{info}}$) is sometimes defined as the derivative of entropy with respect to an "informational energy" or complexity measure:

$$T_{\text{info}} = \frac{\partial H}{\partial E_{\text{info}}}$$
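Since $T_{\text{info}}$ is not a uniquely defined standard quantity, the following is only a loose sketch under an explicit assumption: $E_{\text{info}}$ is taken to be the prefix length in bytes (a stand-in for whatever complexity measure would actually be used), and the derivative is approximated by a finite difference between growing prefixes. All names here are hypothetical:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    n = len(data)
    if n == 0:
        return 0.0
    return sum((c / n) * math.log2(n / c) for c in Counter(data).values())

def information_temperature(data: bytes, step: int = 256) -> list[float]:
    """Finite-difference approximation of T_info = dH/dE_info over growing
    prefixes of `data`, ASSUMING E_info is the prefix length in bytes
    (a placeholder for a real complexity measure)."""
    temps = []
    prev_h = shannon_entropy(data[:step])
    for end in range(2 * step, len(data) + 1, step):
        h = shannon_entropy(data[:end])
        temps.append((h - prev_h) / step)  # dH / dE_info, with dE_info = step bytes
        prev_h = h
    return temps

# For random data the prefix entropy saturates near 8 bits per byte,
# so the "temperature" decays toward zero as more data is consumed.
print(information_temperature(os.urandom(4096))[:4])
```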
Entropy and mutual information relationship
Applications
See example (generated by an AI):
And an output example:
See these sources for more context: