(Conditional) Independence testing & Markov blanket feature selection using k-NN mutual information and conditional mutual information estimators. Supports continuous, discrete, and mixed data, as well as multiprocessing.
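As context for the k-NN estimators these packages build on, here is a minimal sketch of the Kraskov–Stögbauer–Grassberger (KSG) construction for two 1-D samples. This is an illustrative stdlib-only implementation, not this package's API; the function names are hypothetical.

```python
import math
import random

def digamma(x):
    """Digamma via the recurrence psi(x) = psi(x+1) - 1/x, plus an
    asymptotic expansion once the argument is large enough."""
    result = 0.0
    while x < 6:
        result -= 1.0 / x
        x += 1.0
    return (result + math.log(x) - 1.0 / (2 * x)
            - 1.0 / (12 * x**2) + 1.0 / (120 * x**4))

def ksg_mi(x, y, k=3):
    """KSG estimator (algorithm 1) of I(X;Y) in nats for 1-D samples.
    Brute-force O(n^2) neighbor search; fine for small n."""
    n = len(x)
    mi = digamma(k) + digamma(n)
    for i in range(n):
        # Distance to the k-th nearest neighbor under the max norm
        # in the joint (x, y) space.
        dists = sorted(max(abs(x[i] - x[j]), abs(y[i] - y[j]))
                       for j in range(n) if j != i)
        eps = dists[k - 1]
        # Count marginal neighbors strictly within that radius.
        nx = sum(1 for j in range(n) if j != i and abs(x[i] - x[j]) < eps)
        ny = sum(1 for j in range(n) if j != i and abs(y[i] - y[j]) < eps)
        mi -= (digamma(nx + 1) + digamma(ny + 1)) / n
    return mi

if __name__ == "__main__":
    random.seed(0)
    xs = [random.random() for _ in range(400)]
    ys = [xi + 0.1 * random.gauss(0, 1) for xi in xs]
    print(ksg_mi(xs, ys))  # clearly positive for dependent data
```

For dependent samples the estimate is clearly positive, while for independent samples it hovers near zero. A production implementation would use a KD-tree or ball tree for the neighbor search instead of the O(n^2) scan.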
VOOZH
Implementation of Information Bottleneck with Mutual Information Neural Estimation (MINE)
This project, developed as part of the "Information Theory and Inference" exam, uses different discrete and continuous estimators to calculate the mutual information between layers of an RNN trained to perform a cognitive task.
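For context, MINE estimates mutual information by training a neural network $T_\theta$ to maximize the Donsker–Varadhan lower bound on the KL divergence between the joint and the product of marginals (standard formulation of the MINE objective, not specific to this project):

```latex
I(X;Z) \;\ge\; \sup_{\theta}\; \mathbb{E}_{P_{XZ}}\!\left[T_\theta(x,z)\right] \;-\; \log \mathbb{E}_{P_X \otimes P_Z}\!\left[e^{T_\theta(x,z)}\right]
```

The expectation over $P_X \otimes P_Z$ is approximated in practice by shuffling one of the two sample batches.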