Description
The information bottleneck (IB), an abstract mathematical framework for compressing relevant information, has attracted attention recently due to newly discovered connections with deep learning. Specifically, state-of-the-art deep learning approaches now enable us to compute IB quantities numerically. With this new tool at our disposal, it is interesting to explore its relationship with the Renormalization Group (RG), where an a priori different notion of relevant information exists: that of relevant operators. Relatedly, IB shows promise as an automated method for identifying relevant (slow) degrees of freedom in complex interacting models. In this talk, I'll introduce the concept of IB and then report some of the progress we have made on these theoretical and applied fronts. I'll describe a concrete dictionary between relevant operators and bifurcation points in IB compression. In addition, I'll report some recent applications of this approach to self-dual criticality in three dimensions.
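For readers unfamiliar with the IB framework mentioned above, the following is a minimal, illustrative sketch of the classic iterative IB algorithm (Tishby, Pereira, and Bialek) on a toy discrete joint distribution. The distribution, bottleneck cardinality, and trade-off parameter beta are arbitrary choices for illustration, not quantities from the talk:

```python
import numpy as np

# Toy joint distribution p(x, y); values are arbitrary for illustration.
rng = np.random.default_rng(0)
pxy = rng.random((4, 3))
pxy /= pxy.sum()                       # joint p(x, y)
px = pxy.sum(axis=1)                   # marginal p(x)
py = pxy.sum(axis=0)                   # marginal p(y)
py_x = pxy / px[:, None]               # conditional p(y|x)

n_t = 2                                # cardinality of the bottleneck variable T
beta = 5.0                             # trade-off: compression vs. relevance
eps = 1e-12                            # numerical floor inside logs

# Random initialization of the encoder p(t|x).
pt_x = rng.random((4, n_t))
pt_x /= pt_x.sum(axis=1, keepdims=True)

for _ in range(200):
    pt = px @ pt_x                              # marginal p(t)
    px_t = (pt_x * px[:, None]) / pt[None, :]   # Bayes: p(x|t)
    py_t = px_t.T @ py_x                        # decoder p(y|t)
    # KL(p(y|x) || p(y|t)) for every pair (x, t).
    kl = np.sum(py_x[:, None, :] *
                (np.log(py_x[:, None, :] + eps) -
                 np.log(py_t[None, :, :] + eps)), axis=2)
    # Self-consistent update: p(t|x) ∝ p(t) exp(-beta * KL).
    pt_x = pt[None, :] * np.exp(-beta * kl)
    pt_x /= pt_x.sum(axis=1, keepdims=True)

# Mutual informations I(T;X) and I(T;Y) of the converged encoder.
pt = px @ pt_x
ptx = pt_x * px[:, None]                        # joint p(x, t)
pty = ptx.T @ py_x                              # joint p(t, y)
i_tx = np.sum(ptx * np.log((pt_x + eps) / (pt[None, :] + eps)))
i_ty = np.sum(pty * np.log((pty + eps) /
                           (pt[:, None] * py[None, :] + eps)))
```

Sweeping beta traces out the IB trade-off curve between compression I(T;X) and relevance I(T;Y); the bifurcation points mentioned in the abstract are the critical beta values at which the optimal encoder's effective cardinality changes.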