1. Introduction. The range of problems addressed by information theory. Communication and Information.
a. Gleick, J., The Information: A History, a Theory, a Flood. Prologue, Chapters 1, 6, 7; optional: Chapters 8, 9, 13.
b. Additional readings/tutorials on probability theory and logarithms.
2. Mathematical foundations of information theory: entropy, conditional entropy, mutual information (definitions recapped below)
Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory. New York: Wiley. Chapter 2.
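For quick reference during this session, the standard definitions from Cover & Thomas (Chapter 2), in LaTeX notation:
H(X) = -\sum_{x} p(x) \log_2 p(x)
H(Y \mid X) = -\sum_{x,y} p(x,y) \log_2 p(y \mid x)
I(X;Y) = H(Y) - H(Y \mid X) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}
Entropy is the average surprisal of X in bits; conditional entropy is the uncertainty left in Y once X is known; mutual information is the reduction in uncertainty about either variable from observing the other.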
3. Mathematical foundations of information theory: relative entropy, divergences & the data processing inequality (key formulas recapped below)
a. Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory. New York: Wiley. Chapters 1 & 2.
b. Kullback-Leibler divergence: formulation & applications (presenter, discussion leader)
c. Data Processing Inequality: principle & applications (presenter, discussion leader)
d. Sufficient statistics and Maximum Entropy Principle
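For quick reference, the two central objects of this session, in LaTeX notation:
D(p \| q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}
Relative entropy (Kullback-Leibler divergence) is nonnegative and equals zero iff p = q, but it is not a metric: it is asymmetric and violates the triangle inequality. The data processing inequality says that if X \to Y \to Z form a Markov chain, then
I(X;Y) \ge I(X;Z)
i.e., no processing of Y, deterministic or stochastic, can increase the information it carries about X.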
4. Foundations of Information Theory
a. Weaver, W., “Recent Contributions to the Mathematical Theory of Communication,” & Shannon, C. E., “The Mathematical Theory of Communication” (selected fragments). In: Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication. Urbana: The University of Illinois Press. (presenters, discussion leaders)
b. Brillouin, L. (1969). Nauka a teoria informacji [Science and Information Theory]. Chapter 1.
5. Applications: Neurobiology
a. Tononi, G., Edelman, G. M., & Sporns, O. (1998). Complexity and coherency: Integrating information in the brain. Trends in Cognitive Sciences, 2, 474–484 (fragments: up to p. 480, excluding the section “Reconciling information processing and information storage: matching complexity”). (presenters, discussion leaders)
6. Applications: Language
a. Zipf, G. K. (1964). The Psychobiology of Language: An Introduction to Dynamic Philology. Chapter II, “The Form and Behavior of Words” (recommended)
b. Piantadosi, S. T. (2014). Zipf’s word frequency law in natural language: A critical review and future directions. Psychonomic Bulletin & Review. (presenter, discussion leader)
c. Coupé et al. (2019). Different languages, similar encoding efficiency: Comparable information rates across the human communicative niche. Science Advances, Vol. 5, no. 9, eaaw2594, DOI: 10.1126/sciadv.aaw2594 (presenter, discussion leader)
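For reference, the law this session revolves around: Zipf’s law states that the frequency f(r) of the r-th most frequent word in a corpus falls off roughly as a power law,
f(r) \propto r^{-\alpha}, with \alpha \approx 1,
so the second most frequent word occurs about half as often as the first, the third about a third as often, and so on. Piantadosi’s review also covers the better-fitting Zipf-Mandelbrot generalization f(r) \propto (r + \beta)^{-\alpha}.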
7. Applications: Networks & structure
a. Introduction to Networks
b. Klein, B., & Hoel, E. (2020). The Emergence of Informative Higher Scales in Complex Networks. Complexity, 2020, 1–12.
(presenter, discussion leader)
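A sketch of the paper’s central quantity, stated under the assumption that this matches Klein & Hoel’s network formulation (worth checking against the paper itself): with each node i assigned a normalized out-weight vector W_i^{out}, effective information is
EI = H(\langle W_i^{out} \rangle) - \langle H(W_i^{out}) \rangle
the entropy of the network’s average out-distribution minus the average entropy of the individual out-distributions. When a coarse-grained (macro-scale) version of the network has higher EI than the original, the macro scale is deemed more informative, which is the paper’s notion of causal emergence.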
8. Algorithmic information theory
a. Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory. Chapter 14 (fragments).
b. Intro to algorithmic information theory
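For reference, the central definition of sessions 8–9: the Kolmogorov complexity of a string x relative to a universal machine U is the length of the shortest program that outputs it,
K_U(x) = \min \{ |p| : U(p) = x \}.
By the invariance theorem the choice of U shifts K by at most an additive constant, but K is uncomputable, which motivates the approximation methods read next.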
9. Complexity Measures I
a. Soler-Toscano, F., Zenil, H., Delahaye, J.-P., & Gauvrit, N. (2014). Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines. PLoS ONE, 9(5), e96223. https://doi.org/10.1371/journal.pone.0096223
(presenter, discussion leader)
b. Gauvrit, N., Zenil, H., Soler-Toscano, F., Delahaye, J.-P., & Brugger, P. (2017). Human behavioral complexity peaks at age 25. PLOS Computational Biology, 13(4), e1005408. https://doi.org/10.1371/journal.pcbi.1005408
(1 presenter, 1 discussion leader)
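A sketch of the coding-theorem method behind the Soler-Toscano et al. paper: the algorithmic probability of a string s is m(s) = \sum_{p : U(p) = s} 2^{-|p|}, the chance that a random program outputs s, and Levin’s coding theorem links it to complexity:
K(s) = -\log_2 m(s) + O(1).
The paper approximates m(s) by enumerating small Turing machines, recording how often a machine halts with output s, and estimating K(s) \approx -\log_2 of that output frequency; strings produced by many small machines thereby come out simple.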
10. Complexity Measures II: Information, energy, cognition
a. Lloyd, S., & Pagels, H. (1988). Complexity as Thermodynamic Depth. Annals of Physics, 188, pp. 186–191 (recommended).
b. Klamut, Kutner & Struzik, 2020, Towards a universal measure of complexity, Entropy (recommended)
c. Deacon, T. & Koutroufinis, S. (2014). Complexity and Dynamical Depth. Information, 5, 404-423. (presenters, discussion leader)
11. Can Shannon information be a basis for semantic information?
a. Hasselman, F. (2022). Radical embodied computation: Emergence of meaning through the reproduction of similarity by analogy (…) (recommended)
b. Isaac, A. (2019). The Semantics Latent in Shannon Information.
(presenters, discussion leader)
12–14. Project presentations @ mini-conference