
It is our pleasure to invite you to attend the lecture series on Topological Data Analysis and Information Theory, organised by IAS fellow Fernando Nobrega Santos and Rick Quax.

Event details of Topological Data Analysis and Information Theory (online)
Date 5 July 2021
Time 14:00-17:30
Organised by Fernando Nobrega Santos, Rick Quax

High-order interactions are interactions that cannot be reduced to a sequence of pairwise interactions. Multiple, qualitatively different approaches exist for detecting and quantifying them. Two of the most prominent are topological data analysis (TDA) and information theory (IT). Central questions addressed in this lecture series are: what do these two approaches have in common? How can they complement each other? And what could they bring to application domains, especially neuroscience?

Programme Monday 5 July 2021

14:00-14:10 Opening by IAS
14:10-15:10 Lecture by Rick Quax (UvA - IAS)
Title: Brief introduction to information theory and the concept(s) of synergy
15:10-15:20 Break
15:20-16:20 Lecture by Fernando Rosas (Imperial College London, UK)
Title: Towards a deeper understanding of high-order interdependencies in complex systems
16:20-16:30 Break
16:30-17:30 Lecture by Pierre Baudot (Median Technologies, France)
Title: Information is Topology

Each lecture will be 50 min, followed by Q&A. To participate, register below.

First lecture

Title: Brief introduction to information theory and the concept(s) of synergy

Speaker: Rick Quax (UvA - IAS) 

Abstract: Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of the single-source predictions. It is an essential phenomenon in biology, for instance in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, but it also appears in social cooperation processes and in statistical inference tasks in machine learning. Here we propose a metric of synergistic entropy and synergistic information from first principles. Rick will start with a brief, need-to-know introduction to key information-theoretic notions (entropy, mutual information) and then move on to the concept of synergistic information. He will highlight a few of the intuitive approaches to quantifying synergistic information that exist today, including partial information decomposition (PID), a geometric approach, and his own proposed method.
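
For readers who want to experiment before the lecture, here is a minimal sketch in Python (an illustrative choice; none of this code is from the talk) of entropy, mutual information, and the textbook XOR example of synergy, where neither input alone carries information about the output yet together they determine it completely:

```python
# Minimal sketch: Shannon entropy, mutual information, and XOR synergy.
import itertools
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Uniform distribution over the four input pairs; Z = X XOR Y.
xs, ys = zip(*itertools.product([0, 1], repeat=2))
zs = tuple(x ^ y for x, y in zip(xs, ys))

print(mutual_information(xs, zs))                 # 0.0 bits: X alone says nothing about Z
print(mutual_information(ys, zs))                 # 0.0 bits: Y alone says nothing about Z
print(mutual_information(list(zip(xs, ys)), zs))  # 1.0 bit: jointly, X and Y determine Z
```

PID and the speaker's proposed measure go well beyond this toy computation; the sketch only shows why summing single-source mutual informations can miss a joint effect.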

Short bio: Rick’s ambition is to study complex adaptive systems with a focus on emergent information processing in dynamical systems. He aims to span the spectrum from theoretical foundations to application domains, ensuring that new theoretical insights have a direct impact on application-oriented research and vice versa. He is currently an Assistant Professor in the Computational Science Lab at the University of Amsterdam and a member of IAS.

Second lecture

Title: Towards a deeper understanding of high-order interdependencies in complex systems

Speaker: Fernando Rosas (Imperial College London, UK)

Abstract: We live in an increasingly interconnected world and, unfortunately, our understanding of interdependency is still rather limited. While bivariate relationships are at the core of most of our data analysis methods, there is still no principled theory to account for the different types of interactions that can occur between three or more variables. This talk explores the vast and largely uncharted territory of multivariate complexity, and discusses information-theoretic approaches that have recently been introduced to fill this important knowledge gap.

The first part of the talk is devoted to synergistic phenomena, which correspond to statistical regularities that affect the whole but not the parts. We explain how synergy can be effectively captured by information-theoretic measures inspired by the nature of higher brain functions, and how these measures allow us to map complex interdependencies onto hypergraphs. The second part of the talk focuses on a new theory of what constitutes causal emergence and how it can be measured from time-series data. This theory enables a formal, quantitative account of downward causation, and introduces “causal decoupling” as a complementary modality of emergence. Importantly, it not only establishes conceptual tools to frame conjectures about emergence rigorously, but also provides practical procedures to test them on data. We illustrate these analysis tools on several case studies, including cellular automata, baroque music, flocking models, and neuroimaging datasets.
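
As a concrete, hedged illustration of a measure in this spirit, the sketch below computes the O-information introduced by Rosas and co-authors (Phys. Rev. E, 2019), which is positive when redundancy dominates a set of variables and negative when synergy dominates; the toy distributions are our own illustrative choices, not examples from the talk:

```python
# Hedged sketch: O-information (Omega = TC - DTC) for discrete variables,
# estimated from equally likely joint samples.
from collections import Counter
from itertools import product
from math import log2

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def o_information(columns):
    """Omega = total correlation minus dual total correlation."""
    n = len(columns)
    h_joint = entropy(list(zip(*columns)))
    tc = sum(entropy(col) for col in columns) - h_joint
    dtc = sum(entropy(list(zip(*(columns[:i] + columns[i + 1:]))))
              for i in range(n)) - (n - 1) * h_joint
    return tc - dtc

# Synergy-dominated triple: Z = X XOR Y, with X, Y independent fair bits.
x, y = zip(*product([0, 1], repeat=2))
z = tuple(a ^ b for a, b in zip(x, y))
print(o_information([x, y, z]))  # -1.0 bit: synergy dominates

# Redundancy-dominated triple: three copies of the same fair bit.
b = (0, 1)
print(o_information([b, b, b]))  # +1.0 bit: redundancy dominates
```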

Short bio: Fernando Rosas received the B.A. degree in music composition and a minor degree in philosophy (2002), the B.Sc. degree in mathematics (2006), and the M.S. and Ph.D. degrees in engineering sciences from the Pontificia Universidad Católica de Chile (PUC, 2012). He worked as a postdoctoral researcher at KU Leuven (Belgium), National Taiwan University (Taiwan), and Imperial College London (UK). He received the “Academic Award” of the Department of Mathematics of the PUC for the best academic performance of his cohort, and was the recipient of a CONICYT Doctoral Fellowship from the Chilean Ministry of Education (2008), an “F+” Scholarship from KU Leuven (2014), and a Marie Skłodowska-Curie Individual Fellowship from the European Union (2017). He is currently working as a Postdoctoral Researcher at the Data Science Institute and the Centre for Psychedelic Research at Imperial College London. His research interests lie at the interface between data science & AI, complexity science, cognitive science, and neuroscience.

Third Lecture

Title: Information is Topology

Speaker: Pierre Baudot (Median Technologies, France)

Abstract: Information theory, probability and statistical dependence, and algebraic topology provide different views of a unified theory that is still under development, in which uncertainty goes as deep as Galois's theory of ambiguity, topoi, and motives. I will review foundations, developed notably by Bennequin and Vigneaux, that uniquely characterize entropy as the first cohomology group on complexes of random variables and probability laws. This framework recovers most of the usual information functions, such as the KL divergence, cross entropy, Tsallis entropies, and differential entropy, in settings of different generality. Multivariate interaction/mutual informations (I_k and J_k) appear as coboundaries, and their negative minima, also called synergy, correspond to homotopical link configurations which, in the image of Borromean links, illustrate what purely collective interactions or emergence can be. These functions refine and characterize statistical independence in the multivariate case, in the sense that (X_1,...,X_n) are independent iff I_k = 0 for all 2 ≤ k ≤ n (whereas for the total correlations G_k it is sufficient that G_n = 0), generalizing the correlation coefficient. Concerning data analysis, restricting to the sub-case of simplicial random-variable structures, applying the formalism to genetic transcription data and to some classical benchmark datasets using the open-access infotopo library reveals that higher-order statistical interactions are not only omnipresent but also constitutive of biologically relevant assemblies. On the machine-learning side, information cohomology provides a topological and combinatorial formalization of supervised and unsupervised learning in deep networks, where the depth of the layers is the simplicial dimension and derivation-propagation is forward (cohomological).
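
To see the Borromean picture numerically, the following hedged sketch computes the interaction informations I_k for the XOR triple using plain Shannon entropies (the open-access infotopo library mentioned above implements the full formalism; this standalone computation, not based on its API, only illustrates the sign behaviour). Like Borromean rings, the three bits are pairwise independent yet globally dependent:

```python
# Hedged sketch: I_k as the alternating sum of the entropies of all
# non-empty variable subsets (so I_2 is the ordinary mutual information).
from collections import Counter
from itertools import combinations, product
from math import log2

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def interaction_information(columns):
    """I_k = sum over non-empty subsets S of (-1)^(|S|+1) H(X_S)."""
    k = len(columns)
    total = 0.0
    for r in range(1, k + 1):
        for subset in combinations(range(k), r):
            h = entropy(list(zip(*(columns[i] for i in subset))))
            total += (-1) ** (r + 1) * h
    return total

# The XOR triple: X, Y independent fair bits, Z = X XOR Y.
x, y = zip(*product([0, 1], repeat=2))
z = tuple(a ^ b for a, b in zip(x, y))

for pair in combinations([x, y, z], 2):
    print(interaction_information(list(pair)))  # 0.0: every pair independent
print(interaction_information([x, y, z]))       # -1.0: synergistic, link-like triple
```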

Short bio: Pierre Baudot graduated in 1998 from the magistère of biology at the École Normale Supérieure (Ulm), and completed his PhD in the electrophysiology of visual perception, studying learning and information coding in natural conditions. He began developing information topology with Daniel Bennequin at the Complex Systems Institute and the Mathematical Institute of Jussieu from 2006 to 2013, and then at the Max Planck Institute for Mathematics in the Sciences in Leipzig. He then joined Inserm in Marseille to develop data applications, notably to transcriptomics. Since 2018 he has worked at Median Technologies, a medical-imaging AI company, to detect and predict cancers from CT scans. He received the K2 trophy (mathematics and applications, 2017) and the best entropy paper prize (2019) for his contributions to topological information data analysis.