Speaker
Prof. Erik Aurell (KTH)
Description
The "Inverse Ising Problem" refers to finding the parameters
(the J_ij's and the h_i's) in an Ising model given the first
and second moments (the magnitizations m_i and the
correclation functions c_ij).
This is of great interest in machine learning and data
analysis whenever the data set and the number of variables
are large, but each variable can be taken to assume only two
values, "high" and "low". The maximum entropy distribution
with given first and second moments then has the Ising form,
where the h_i's and J_ij's are Lagrange parameters.
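Spelled out (a standard result; spins encoded as s_i = ±1 and
Z the normalizing partition function), this distribution reads

\[
P(s) \;=\; \frac{1}{Z}\,\exp\Big(\sum_i h_i s_i + \sum_{i<j} J_{ij}\, s_i s_j\Big),
\qquad \langle s_i\rangle = m_i,\quad \langle s_i s_j\rangle = c_{ij}.
\]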
Recent years have seen an explosion of interest in
approximate but fast methods borrowed from statistical
mechanics for learning such "maxentropy" models from
correlation data. Motivations include, for example, inferring
causal structures underlying observed gene expression, or
inferring functional connectivities between neurons from
multi-neuronal recordings, where measurements from hundreds
of neurons are available today, and millions have been
envisaged.
Although methods borrowed from non-equilibrium statistical
mechanics may be more promising in applications, I will
describe results using equilibrium statistical mechanics, and
the testing ground will mainly be the Sherrington-Kirkpatrick
spin glass.
The methods discussed are simple mean-field, TAP, and the
"Susceptibility Propagation" introduced by Mézard. One main
message is that all these are sensitive to the accuracy of
the correlation data themselves. There is hence a three-way
trade-off between computability, inference accuracy (given
perfect data), and sensitivity to undersampling of the
correlations.
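As a concrete illustration (my sketch, not the speaker's code),
the simple mean-field inversion reads the couplings directly
off the inverse of the connected correlation matrix,
J_ij ≈ -(C^{-1})_ij for i ≠ j, with TAP adding an Onsager
correction on top (not shown). A minimal numpy example on a
small SK-like model, with exact moments computed by brute-force
enumeration; all names below are mine:

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    # Small SK-style model: symmetric Gaussian couplings of
    # standard deviation 1/sqrt(N), zero diagonal, weak fields.
    N = 8
    J_true = np.triu(rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N)), 1)
    J_true = J_true + J_true.T
    h_true = rng.normal(0.0, 0.1, size=N)

    # Exact moments by enumerating all 2^N configurations
    # (feasible only for small N).
    states = np.array(list(itertools.product([-1.0, 1.0], repeat=N)))
    E = -(states @ h_true) - 0.5 * np.einsum("si,ij,sj->s", states, J_true, states)
    p = np.exp(-E)
    p /= p.sum()
    m = p @ states                                   # magnetizations m_i
    C = np.einsum("s,si,sj->ij", p, states, states) - np.outer(m, m)

    # Simple (naive) mean-field inversion: J_ij = -(C^{-1})_ij, i != j.
    Cinv = np.linalg.inv(C)
    J_nmf = -Cinv
    np.fill_diagonal(J_nmf, 0.0)
    h_nmf = np.arctanh(m) - J_nmf @ m                # mean-field fields

    print("max |J_nmf - J_true|:", np.abs(J_nmf - J_true).max())

Replacing the exact m and C above with empirical averages over
a finite sample would mimic the undersampling whose effect the
talk discusses.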
This is work done or in progress with John Hertz, Yasser
Roudi, Mikko Alava, Hamed Mahmoudi, Aymeric Fouquier
d'Herouel, Jarkko Salojärvi, Zeng Hong-Li and Charles Ollion.
Similar results to ours on Susceptibility Propagation have
been obtained by Enzo Marinari (paper available on arXiv.org).