1. A. Kraskov, H. Stögbauer, and P. Grassberger, "Estimating Mutual Information", Physical Review E (PRE), vol. 69, 066138, 2004.
2. M. I. Belghazi, A. Baratin, and S. Rajeswar, "Mutual Information Neural Estimation", arXiv:1801.04062, 2018.
3. S. Park and P. M. Pardalos, "Deep Data Density Estimation through Donsker-Varadhan Representation", arXiv:2104.06612, 2021.
4. K. Rose, E. Gurewitz, and G. C. Fox, "Statistical Mechanics and Phase Transitions in Clustering", Physical Review Letters (PRL), vol. 65, 945, 1990.
5. N. Tishby, F. C. Pereira, and W. Bialek, "The information bottleneck method", Proceedings of the 37th Annual Allerton Conference on Communication, Control, and Computing, pp. 368-377, 2000; arXiv:physics/0004057.
6. N. Zaslavsky and N. Tishby, "Deterministic annealing and the evolution of Information Bottleneck representations", 2019. https://www.nogsky.com/publication/2019-evo-ib/
7. S. Gao, G. V. Steeg, and A. Galstyan, "Efficient Estimation of Mutual Information for Strongly Dependent Variables", arXiv:1411.2003, 2014.
8. S. Park and P. M. Pardalos, "Deep Data Density Estimation through Donsker-Varadhan Representation", arXiv:2104.06612, 2021.
9. G. Chechik, A. Globerson, N. Tishby, and Y. Weiss, "Information Bottleneck for Gaussian Variables", Journal of Machine Learning Research, vol. 6, pp. 165-188, 2005.