Information theory driven construction of effective theories
18.11.2022 14:15 – 15:15
Identifying the relevant degrees of freedom is key to developing an effective theory of a complex system. "Relevance" is well defined in physics within the renormalisation group (RG) program, whose practical execution in unfamiliar systems is, however, difficult. Machine learning approaches, on the other hand, though excelling at feature extraction, lack formal interpretability: it is unclear what relation such architecture- and training-dependent learned "relevant features" bear to the objects of physical theory.
I will discuss how to bridge the above gap, paving a path to the automated discovery of mathematically formal and interpretable physical theories from raw data. To this end we show that the field-theoretic notion of "relevance" is in fact equivalent* to the notion of "relevant information" defined in the Information Bottleneck (IB) formalism of compression theory. Employing recent ML-based tools for estimating information-theoretic quantities, we then construct an unsupervised algorithm whose inputs are raw configurations of a system, and whose outputs are neural networks parametrising formal objects such as "order parameters" or, more generally, "scaling operators". Information about the phase diagram, correlations, and symmetries (including emergent ones) can then be extracted. I will also discuss the recent extension of these tools to quasiperiodic lattices, and mention possible applications of similar ideas to dynamical systems.
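Schematically, the IB trade-off invoked above is usually written as follows. This is the standard IB variational problem, not necessarily the exact objective used in the work presented; the identification of the variables $X$, $Y$, $Z$ with the RG setting is an illustrative assumption.

```latex
% Standard Information Bottleneck objective: compress X into a
% representation Z while retaining information about a "relevant"
% variable Y (in the RG reading, X may be a block of degrees of
% freedom and Y its environment):
\min_{p(z \mid x)} \; \mathcal{L}_{\mathrm{IB}}
  = I(X;Z) \;-\; \beta \, I(Z;Y)
```

Here $I(\cdot;\cdot)$ denotes mutual information and $\beta$ controls the compression/relevance trade-off; the neural networks mentioned in the abstract would parametrise the compressed variable $Z$, i.e. the learned relevant features.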
Venue
Building: Ecole de Physique
Auditoire Stuckelberg
Organised by
Département de physique théorique
Speakers
Maciej Koch-Janusz, Zurich/Chicago
Free admission
More info
Contact: missing email