Finite-Time Lyapunov Exponents of Deep Neural Networks

13 Mar 2024, 10:15
45m
Albano 3: 4205 - SU Conference Room (40 seats) (Albano Building 3)

Speaker

Kristian Gustavsson (Gothenburg University)

Description

Deep neural networks have recently led to breakthroughs in various fields, ranging from image recognition and natural language processing to autonomous driving and medical diagnosis. Despite these achievements, a comprehensive understanding of their learning process is still lacking. We use parallels with dynamical systems to investigate how small perturbations to the input affect the output of deep neural networks. The growth or decay of the perturbations is characterized by finite-time Lyapunov exponents. We show that the maximal exponent forms geometrical structures in input space, akin to coherent structures in dynamical systems. These ridges visualize the geometry that deep networks construct in input space, shedding light on the fundamental mechanisms underlying their learning process.
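A minimal sketch of the underlying idea (not the speaker's code): treat the depth of a feed-forward network as the "time" of a dynamical system, propagate the input-output Jacobian through the layers by the chain rule, and define the maximal finite-time Lyapunov exponent at an input point as the depth-normalized log of the Jacobian's largest singular value. The toy network, its random weights, and the helper names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected tanh network; weights are arbitrary stand-ins,
# not a trained model.
layers = [(rng.standard_normal((8, 2)), rng.standard_normal(8)),
          (rng.standard_normal((8, 8)), rng.standard_normal(8)),
          (rng.standard_normal((1, 8)), rng.standard_normal(1))]

def forward_with_jacobian(x):
    """Propagate the input x and the Jacobian dy/dx through the network."""
    J = np.eye(x.size)
    for W, b in layers:
        x = np.tanh(W @ x + b)
        # Chain rule: diag(tanh'(z)) @ W composed with the running Jacobian.
        J = (1.0 - x**2)[:, None] * W @ J
    return x, J

def maximal_ftle(x):
    """Maximal finite-time Lyapunov exponent at input x: the log of the
    largest singular value of the input-output Jacobian, normalized by
    network depth (which plays the role of integration time)."""
    _, J = forward_with_jacobian(x)
    sigma_max = np.linalg.svd(J, compute_uv=False)[0]
    return np.log(sigma_max) / len(layers)

print(maximal_ftle(np.array([0.3, -0.7])))
```

Evaluating `maximal_ftle` on a grid of inputs and plotting the result would reveal the ridge structures described above: lines along which nearby inputs separate fastest, analogous to Lagrangian coherent structures in fluid flows.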
