Speaker
Description
Humans can separately recognize individual sources when they sense a mixture of them. We have previously developed the error-gated Hebbian rule
(EGHR), a learning rule for neural networks that achieves independent component analysis (ICA). The EGHR approximately maximizes the information flow through the network by updating synaptic strengths using only information locally available at each synapse, which also makes it suitable for neuromorphic engineering. Each update is the product of the presynaptic activity, the postsynaptic activity, and a global factor. When the number of sensors exceeds the number of sources, the EGHR can combine ICA with dimensionality reduction in useful ways. First, how the sources are mixed can depend on the context; the EGHR can solve this multi-context ICA problem by extracting the low-dimensional sources from high-dimensional sensory inputs. Second, if the input dimensionality is much higher than the source dimensionality, the EGHR can accurately perform nonlinear ICA. I will discuss an application of this nonlinear ICA technique to predictive coding of dynamic sources.
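The three-factor update described above (presynaptic activity × postsynaptic term × global factor) can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the nonlinearity g(u) = tanh(u), the output surprise E(u) = Σᵢ log cosh(uᵢ), the constant E0, the learning rate, and the mixing matrix are all illustrative assumptions for super-Gaussian (Laplace-like) sources.

```python
import numpy as np

def eghr_step(W, x, eta, E0):
    """One EGHR update: Delta w_ij = eta * (E0 - E(u)) * g(u_i) * x_j.
    The scalar (E0 - E(u)) is the global gating factor; g(u_i) and x_j
    are the post- and presynaptic terms, so each synapse only needs
    locally available quantities plus one broadcast scalar."""
    u = W @ x                          # postsynaptic activities
    g = np.tanh(u)                     # assumed nonlinearity for super-Gaussian sources
    E = np.sum(np.log(np.cosh(u)))     # assumed surprise of the current output
    W = W + eta * (E0 - E) * np.outer(g, x)
    return W

def demo(n_samples=20000, seed=0):
    """Toy linear ICA: two independent Laplace sources, linearly mixed."""
    rng = np.random.default_rng(seed)
    s = rng.laplace(size=(2, n_samples)) / np.sqrt(2.0)  # unit-variance sources
    A = np.array([[1.0, 0.6],
                  [0.4, 1.0]])         # illustrative mixing matrix
    X = A @ s
    W = np.eye(2)                      # start the unmixing matrix at identity
    for t in range(n_samples):
        W = eghr_step(W, X[:, t], eta=0.01, E0=1.5)  # E0 chosen heuristically
    return W, W @ X                    # unmixing matrix and recovered signals
```

Note that the global factor changes sign around E0: when the output surprise exceeds E0 the update shrinks the weights, which is what keeps the rule self-stabilizing without explicit weight normalization.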