University of Warsaw - Central Authentication System

Information theory for cognitive sciences 2500-EN-COG-OB2Z-C-1
Seminar (SEM), winter semester 2024/25

Course information (common to all groups)

Number of hours: 30
Enrollment limit: (no limit)
Type of credit: graded credit
Assessment methods and criteria: (in English only)

40% Project and its presentation

30% Short paper presentation and guiding the discussion

20% Homework(s)

10% Class presence and active participation

Attendance at the seminar is obligatory; up to 2 unexcused absences are allowed.

Students must respect the principles of academic integrity. Cheating and plagiarism (including copying work from other students, the internet, or other sources) are serious, punishable violations, and instructors are required to report all cases to the administration.

Topics: (in English only)

1. Introduction. Range of problems for information theory. Communication and Information.

a. Gleick: The Information. Prologue, Chapters 1, 6, 7; optional: Chapters 8, 9, 13.

b. Additional readings/tutorials on probability theory and logarithms.

2. Mathematical bases of information theory: entropy, conditional entropy, mutual information

Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, New York: Wiley. Chapter 2.
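The three quantities named in this session can be illustrated with a short, self-contained Python sketch; the joint distribution below is a made-up toy example, not taken from the readings:

```python
from collections import Counter
from math import log2

# Toy joint distribution over (X, Y) pairs; probabilities are illustrative.
joint = {("rain", "umbrella"): 0.3, ("rain", "none"): 0.1,
         ("sun", "umbrella"): 0.1, ("sun", "none"): 0.5}

def entropy(dist):
    """Shannon entropy H(p) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px, py = Counter(), Counter()
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

H_xy = entropy(joint)             # joint entropy H(X, Y)
H_x, H_y = entropy(px), entropy(py)
I_xy = H_x + H_y - H_xy           # mutual information I(X; Y)
H_y_given_x = H_xy - H_x          # conditional entropy H(Y | X)
```

Because X and Y are dependent in this toy example, I(X; Y) comes out positive and H(Y | X) is strictly below H(Y).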

3. Mathematical bases of information theory: relative entropy, divergences & data processing inequality

a. Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, New York: Wiley; Chapters 1 & 2.

b. Kullback-Leibler divergence: formulation & applications (presenter, discussion leader)

c. Data Processing Inequality: principle & applications (presenter, discussion leader)

d. Sufficient statistics and Maximum Entropy Principle
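A minimal numeric sketch of the Kullback-Leibler divergence discussed in this session; the distributions p and q are arbitrary illustrative choices:

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(p || q) = sum_x p(x) log2(p(x)/q(x)), in bits.
    Requires q(x) > 0 wherever p(x) > 0."""
    return sum(px * log2(px / q[x]) for x, px in p.items() if px > 0)

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}
d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
# KL divergence is non-negative, zero only for identical distributions,
# and asymmetric: d_pq != d_qp in general.
```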

4. Foundations of Information Theory

a. Weaver, W. Recent contributions to the mathematical theory of communication, & Shannon, C. “The Mathematical Theory of Communication” (selected fragments). In: Shannon, C. & Weaver, W., The Mathematical Theory of Communication. The University of Illinois Press: Urbana. (presenters, discussion leaders)

b. Brillouin, L. (1969). Nauka a Teoria Informacji [Science and Information Theory]. Chapter 1.

5. Applications: Neurobiology

a. Tononi, G.; Edelman, G.M.; Sporns, O. Complexity and coherency: Integrating information in the brain. Trends in Cognitive Sciences 1998, 2, 474–484 (fragments, up to p. 480, excluding “Reconciling information processing and information storage: matching complexity”)

(presenters, discussion leaders)

6. Applications: Language

a. Zipf, G. K. (1964). “The Psychobiology of Language: An Introduction to Dynamic Philology.” Chapter II, “The Form and Behavior of Words” (recommended)

b. Piantadosi, S.T. (2014). Zipf’s word frequency law in natural language: A critical review and future directions. Psychonomic Bulletin & Review. (presenter, discussion leader)

c. Coupé et al. (2019). Different languages, similar encoding efficiency: Comparable information rates across the human communicative niche. Science Advances, Vol. 5, no. 9, eaaw2594, DOI: 10.1126/sciadv.aaw2594 (presenter, discussion leader)
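Zipf’s rank-frequency relation can be probed on any word list. A toy Python sketch follows; the sentence is invented, and a corpus this small will not fit the law well:

```python
from collections import Counter

# An invented toy "corpus"; real analyses use large text collections.
text = ("the cat sat on the mat the dog sat on the log "
        "the cat saw the dog and the dog saw the cat").split()

counts = Counter(text)
freqs = sorted(counts.values(), reverse=True)  # frequency ordered by rank

# Under Zipf's law, frequency is roughly C / rank, so the product
# rank * frequency should stay roughly constant across ranks.
zipf_products = [r * f for r, f in enumerate(freqs, start=1)]
```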

7. Applications: Networks & structure

a. Introduction to Networks

b. Klein, B., & Hoel, E. (2020). The Emergence of Informative Higher Scales in Complex Networks. Complexity, 2020, 1–12.

(presenter, discussion leader)

8. Algorithmic information theory

a. Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, Chapter 14 (fragments).

b. Intro to algorithmic information theory

9. Complexity Measures I

a. Soler-Toscano, F., Zenil, H., Delahaye, J.-P., & Gauvrit, N. (2014). Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines. PLoS ONE, 9(5), e96223. https://doi.org/10.1371/journal.pone.0096223

(presenter, discussion leader)

Gauvrit, N., Zenil, H., Soler-Toscano, F., Delahaye, J.-P., & Brugger, P. (2017). Human behavioral complexity peaks at age 25. PLOS Computational Biology, 13(4), e1005408. https://doi.org/10.1371/journal.pcbi.1005408

(1 presenter, 1 discussion leader)
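The papers above estimate algorithmic complexity via output frequencies of small Turing machines; a much cruder but common computable proxy is compression length, sketched here with Python’s zlib (an illustration only, not the method of the readings):

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Bytes of zlib output: a rough upper-bound proxy for the
    Kolmogorov complexity of the string."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

regular = "ab" * 500                # highly regular 1000-character string
random.seed(0)                      # fixed seed for reproducibility
noisy = "".join(random.choice("ab") for _ in range(1000))

# The regular string has a short description ("repeat 'ab' 500 times"),
# so it should compress far better than the pseudo-random one.
```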

10. Complexity measures II: Information, energy, cognition

a. Lloyd & Pagels, 1988, Complexity as Thermodynamic Depth, Annals of Physics, 188, pp. 186–191 (recommended).

b. Klamut, Kutner & Struzik, 2020, Towards a universal measure of complexity, Entropy (recommended)

c. Deacon, T. & Koutroufinis, S. (2014). Complexity and Dynamical Depth. Information, 5, 404-423. (presenters, discussion leader)

11. Can Shannon information be a basis for semantic information?

a. Hasselman, F. (2022). Radical embodied computation: Emergence of meaning through the reproduction of similarity by analogy (...) (recommended)

b. Isaac, A. (2019) The Semantics Latent in Shannon Information.

(presenters, discussion leader)

12–14. Project presentations at a mini-conference

Teaching methods: (in English only)

The seminar will consist of:

Presentation of the main concepts and measures of information theory; homework problems will be assigned to test students’ understanding. Problems can be discussed in class.

Students are expected to introduce one paper during class (20 minutes) and to initiate and guide the discussion of that paper.

Students, individually or in pairs, will conduct a small research project: they will formulate a research problem, gather simple datasets, and use information-theoretic measures to answer their research questions.

Class groups


Group Schedule Instructors Enrolled / limit
1 every Monday, 10:15–11:45, room 94 Julian Zubek, Szymon Talaga 16/20
All classes take place in the building:
Budynek Dydaktyczny - Stawki 5/7
Course descriptions in USOS and USOSweb are protected by copyright.
The copyright holder is the University of Warsaw.
ul. Banacha 2
02-097 Warszawa
tel: +48 22 55 44 214 https://www.mimuw.edu.pl/