Graphs are everywhere! In anatomy and biology, they appear as transportation systems for air, water, nutrients, or signals, and are found both on the large scale of arteries and airways, and on the small scale of neurons in the brain. The structure, geometry and state of these networks affect their function, and therefore also the health of nearby tissue. Conversely, the state of the surrounding tissue also affects the networks, making them both first- and second-order reporters of health, disease and dysfunction. As a consequence, networks are studied extensively in both biology and medicine, and, as a proxy for these, in imaging.
In this talk we first discuss a well-known space of graphs, in which networks are modelled as equivalence classes of adjacency matrices modulo the action of the node permutation group. We derive geometric properties of this space and discuss their implications for statistical tasks such as dimensionality reduction and graph-valued regression. Next, we discuss the potential for carrying these geometric insights into the realm of deep learning on graphs.
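To fix ideas, here is a minimal sketch of how such a space is commonly constructed, written in our own notation rather than necessarily the exact formulation used in the talk. A graph on $n$ nodes is represented by an adjacency matrix $A \in \mathbb{R}^{n \times n}$, two matrices are identified if they differ only by a relabelling of the nodes, and distances are measured by optimally aligning the labels:
\[
  [A] \;=\; \{\, P^{\top} A P \;:\; P \in \mathcal{P}_n \,\}, \qquad
  d\big([A],[B]\big) \;=\; \min_{P \in \mathcal{P}_n} \big\lVert A - P^{\top} B P \big\rVert_F ,
\]
where $\mathcal{P}_n$ denotes the set of $n \times n$ permutation matrices and $\lVert \cdot \rVert_F$ the Frobenius norm. Because the quotient of a Euclidean space by this finite group of isometries is no longer a vector space, standard statistical tools do not apply directly, which is what makes the geometry of the space relevant for the statistics and learning methods discussed in the talk.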