Abstracting spatial prototypes through short-term suppression of Hebbian weights in a continuously changing environment

S. Tavitian, T. Fomin and A. Lõrincz

Neural Network World, in press (1997)


Abstract


A step towards assumption-free self-organization is proposed. We address the problem of learning a stable representation of the environment from inputs that change continuously at an unpredictable rate. The vulnerability of competitive Hebbian learning to slowly changing inputs is assessed. It is shown that anti-Hebbian suppression of the feed-forward Hebbian weights broadens the range of rates over which learning is possible and reduces the influence of the rate on the emerging representation. The resulting robustness during real-time training is demonstrated through simulations and compared to an alternative non-synaptic suppression scheme. Certain passive short-term response properties of high-level visual areas are pointed out as biological evidence for this form of short-term plasticity. We raise the question of whether the proposed mechanism may also play a stabilizing role in the learning of invariances.
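To make the mechanism concrete, the following is a minimal sketch of a winner-take-all competitive layer in which each winner's feed-forward weights receive a fast-decaying subtractive suppression trace, so that a recently winning unit is transiently handicapped in the competition. The specific update rule and all parameters (`eta`, `alpha`, `decay`) are illustrative assumptions for exposition, not the paper's exact equations.

```python
import numpy as np

rng = np.random.default_rng(0)


class SuppressedCompetitiveNet:
    """Competitive Hebbian layer with short-term anti-Hebbian
    suppression of the feed-forward weights (illustrative sketch;
    parameters and rule form are assumptions, not the paper's)."""

    def __init__(self, n_units, n_inputs, eta=0.05, alpha=0.5, decay=0.8):
        # Long-term Hebbian weights (the spatial prototypes).
        self.w = rng.normal(scale=0.1, size=(n_units, n_inputs))
        # Short-term suppression trace, subtracted from the weights
        # during competition; decays quickly between presentations.
        self.s = np.zeros((n_units, n_inputs))
        self.eta, self.alpha, self.decay = eta, alpha, decay

    def step(self, x):
        # Competition uses the suppressed effective weights w - s,
        # so a unit that just won is less likely to win again and
        # cannot monopolize a slowly drifting input.
        eff = self.w - self.s
        winner = int(np.argmin(np.linalg.norm(eff - x, axis=1)))
        # Hebbian update: pull the winner's prototype toward the input.
        self.w[winner] += self.eta * (x - self.w[winner])
        # Anti-Hebbian short-term suppression: decay all traces, then
        # load a new trace on the winner along the current input.
        self.s *= self.decay
        self.s[winner] += self.alpha * x
        return winner
```

Because the suppression is short-term (it decays) and only modulates the competition, the long-term prototypes in `w` remain free to converge on the input statistics rather than on the presentation rate.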

