ON THE PROBLEM OF LOCAL MINIMA IN BACKPROPAGATION
M. Gori and A. Tesi
Dipartimento di Sistemi e Informatica, Università di Firenze
Via di Santa Marta 3 - 50139 Firenze - Italy
Tel. (39) 55-4796265 - Fax (39) 55-4796363
e-mail : [email protected]
Abstract
Supervised learning in Multi-Layered Neural Networks (MLNs) has recently been proposed through the well-known Backpropagation algorithm. This is a gradient method which can get stuck in local minima, as simple examples show. In this paper, some conditions on the network architecture and the learning environment are proposed which ensure the convergence of the Backpropagation algorithm. In particular, it is proven that convergence holds if the classes are linearly separable. In this case, the experience gained in several experiments shows that MLNs outperform perceptrons in generalizing to new examples.
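As an illustrative aside (not part of the paper itself), the abstract's claim that a gradient method can get stuck in local minima is easy to reproduce on a toy one-dimensional loss. The sketch below uses a hypothetical non-convex function with two minima of different depth; which stationary point plain gradient descent reaches depends entirely on the starting point, just as Backpropagation's outcome depends on the initial weights.

```python
# A minimal sketch (illustrative only, not from the paper): plain gradient
# descent on a hand-picked non-convex scalar loss with two minima.
def loss(w):
    # f(w) = w^4 - 3w^2 + w has a global minimum near w = -1.30
    # and a shallower local minimum near w = +1.13.
    return w**4 - 3*w**2 + w

def grad(w):
    # Analytic derivative of the loss above.
    return 4*w**3 - 6*w + 1

def descend(w, lr=0.01, steps=2000):
    # Plain gradient descent, the core update of any gradient method.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Started on the left, descent reaches the global minimum; started on the
# right, it stops in the shallower local minimum and never escapes.
print(descend(-2.0))  # ~ -1.30 (global minimum, loss ~ -3.51)
print(descend(+2.0))  # ~ +1.13 (local minimum, loss ~ -1.07)
```

The same mechanism operates in weight space for an MLN: the error surface is non-convex, and gradient descent only guarantees convergence to a stationary point, not to the global minimum.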
Index Terms: Multi-Layered Networks, learning environment, Backpropagation, pattern recognition, linearly-separable classes.