Embo: a Python package for empirical data analysis using the Information Bottleneck

Open Access
|May 2021

References

  1. Tishby N, Pereira FC, Bialek W. The information bottleneck method. In Proceedings of the 37th Annual Allerton Conference on Communication, Control and Computing. 1999; 368–377. arXiv:physics/0004057.
  2. Cover TM, Thomas JA. Elements of Information Theory. Wiley, second edition. 2006. DOI: 10.1002/047174882X
  3. Slonim N, Tishby N. Document clustering using word clusters via the information bottleneck method. In Proceedings of the 23rd annual international ACM SIGIR conference on Research and development in information retrieval – SIGIR ‘00. ACM Press. 2000. DOI: 10.1145/345508.345578
  4. Hecht RM, Noor E, Tishby N. Speaker recognition by Gaussian information bottleneck. In Proceedings of the 10th Annual Conference of the International Speech Communication Association. Brighton, UK. 2009.
  5. Kolchinsky A, Tracey BD, Wolpert DH. Nonlinear Information Bottleneck. Entropy. 2019; 21(12): 1181. DOI: 10.3390/e21121181
  6. Tishby N, Zaslavsky N. Deep learning and the information bottleneck principle. In 2015 IEEE Information Theory Workshop (ITW). IEEE. 2015. DOI: 10.1109/ITW.2015.7133169
  7. Achille A, Soatto S. Emergence of Invariance and Disentanglement in Deep Representations. J. Mach. Learn. Res. 2018; 19(1): 1947–1980. ISSN 1532-4435.
  8. Palmer SE, Marre O, Berry MJ, Bialek W. Predictive information in a sensory population. Proceedings of the National Academy of Sciences. 2015; 112(22): 6908–6913. ISSN 0027-8424. https://www.pnas.org/content/112/22/6908.full.pdf. DOI: 10.1073/pnas.1506855112
  9. Chalk M, Marre O, Tkačik G. Toward a unified theory of efficient, predictive, and sparse coding. Proceedings of the National Academy of Sciences. 2018; 115(1): 186–191. ISSN 0027-8424. DOI: 10.1073/pnas.1711114115
  10. Filipowicz AL, Glaze CM, Kable JW, Gold JI. Pupil diameter encodes the idiosyncratic, cognitive complexity of belief updating. eLife. 2020; 9. DOI: 10.7554/eLife.57872
  11. Filipowicz A, Levine J, Piasini E, Tavoni G, Kable J, Gold J. The comparable strategic flexibility of model-free and model-based learning. bioRxiv. 2020. DOI: 10.1101/2019.12.28.879965
  12. Strouse D, Schwab DJ. The Deterministic Information Bottleneck. Neural Computation. 2017; 29(6): 1611–1630. DOI: 10.1162/NECO_a_00961
  13. Chechik G, Globerson A, Tishby N, Weiss Y. Information Bottleneck for Gaussian Variables. J. Mach. Learn. Res. 2005; 6: 165–188. ISSN 1532-4435.
  14. James RG, Ellison CJ, Crutchfield JP. dit: a Python package for discrete information theory. Journal of Open Source Software. 2018; 3(25): 738. DOI: 10.21105/joss.00738
  15. Creutzig F, Globerson A, Tishby N. Past-future information bottleneck in dynamical systems. Physical Review E. 2009; 79(4). DOI: 10.1103/PhysRevE.79.041925
  16. Harris CR, Millman KJ, van der Walt SJ, Gommers R, Virtanen P, Cournapeau D, Wieser E, Taylor J, Berg S, Smith NJ, Kern R, Picus M, Hoyer S, van Kerkwijk MH, Brett M, Haldane A, del Río JF, Wiebe M, Peterson P, Gérard-Marchant P, Sheppard K, Reddy T, Weckesser W, Abbasi H, Gohlke C, Oliphant TE. Array programming with NumPy. Nature. 2020; 585(7825): 357–362. DOI: 10.1038/s41586-020-2649-2
  17. Virtanen P, Gommers R, Oliphant TE, Haberland M, Reddy T, Cournapeau D, Burovski E, Peterson P, Weckesser W, Bright J, van der Walt SJ, Brett M, Wilson J, Millman KJ, Mayorov N, Nelson ARJ, Jones E, Kern R, Larson E, Carey CJ, Polat İ, Feng Y, Moore EW, VanderPlas J, Laxalde D, Perktold J, Cimrman R, Henriksen I, Quintero EA, Harris CR, Archibald AM, Ribeiro AH, Pedregosa F, van Mulbregt P, SciPy 1.0 Contributors. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nature Methods. 2020; 17: 261–272. DOI: 10.1038/s41592-019-0686-2
  18. Hunter JD. Matplotlib: A 2D graphics environment. Computing in Science & Engineering. 2007; 9(3): 90–95. DOI: 10.1109/MCSE.2007.55
  19. Shamir O, Sabato S, Tishby N. Learning and generalization with the information bottleneck. Theoretical Computer Science. 2010; 411(29–30): 2696–2711. DOI: 10.1016/j.tcs.2010.04.006
  20. Pica G, Piasini E, Chicharro D, Panzeri S. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables. Entropy. 2017; 19(9). ISSN 1099-4300. DOI: 10.3390/e19090451
  21. Lizier J, Bertschinger N, Jost J, Wibral M. Information Decomposition of Target Effects from Multi-Source Interactions: Perspectives on Previous, Current and Future Work. Entropy. 2018; 20(4): 307. DOI: 10.3390/e20040307
  22. Pica G, Piasini E, Safaai H, Runyan C, Harvey C, Diamond M, Kayser C, Fellin T, Panzeri S. Quantifying how much sensory information in a neural code is relevant for behavior. In Advances in Neural Information Processing Systems 30. 2017.
DOI: https://doi.org/10.5334/jors.322 | Journal eISSN: 2049-9647
Language: English
Submitted on: Feb 4, 2020
Accepted on: May 13, 2021
Published on: May 31, 2021
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2021 Eugenio Piasini, Alexandre L. S. Filipowicz, Jonathan Levine, Joshua I. Gold, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.