Publications about 'critical points'
Articles in journal or book chapters
This paper deals with nonlinear least-squares problems involving the fitting to data of parameterized analytic functions. For generic regression data, a general result establishes the countability, and under stronger assumptions the finiteness, of the set of functions giving rise to critical points of the quadratic loss function. In the special case of what are usually called "single-hidden layer neural networks", which are built upon the standard sigmoidal activation tanh(x) (or, equivalently up to an affine change of variables, the logistic function 1/(1+exp(-x))), a rough upper bound for this cardinality is provided as well.
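For the reader's convenience, the equivalence of the two activations mentioned in the abstract amounts to the standard affine identities relating the logistic function and the hyperbolic tangent:

```latex
\frac{1}{1+e^{-x}} \;=\; \frac{1+\tanh(x/2)}{2},
\qquad
\tanh(x) \;=\; \frac{2}{1+e^{-2x}} \;-\; 1 .
```

Either activation can thus be obtained from the other by rescaling inputs and affinely transforming outputs, so results about critical points of the loss transfer between the two parameterizations.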
Conference articles
Internal reports
The following observation must surely be "well-known", but it seems worth giving a simple and quite explicit proof. Take any finite subset X of R^n, n > 1. Then there is a polynomial function P : R^n -> R whose local minima are exactly the points of X and which has no other critical points. Applied to the negative gradient flow of P, this implies that there is a polynomial vector field with asymptotically stable equilibria on X and no other equilibria. Some trajectories of this vector field are not pre-compact; a complementary observation says that, again for arbitrary X, one can find a vector field with asymptotically stable equilibria on X, no other equilibria except saddles, and all omega-limit sets consisting of singletons.
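A concrete instance of the phenomenon (a classical textbook example, not the construction from the report) is the polynomial P(x, y) = (x^2 y - x - 1)^2 + (x^2 - 1)^2, which has exactly two critical points in R^2, both local minima, realizing the case X = {(-1, 0), (1, 2)}. A short symbolic check:

```python
# Sketch: verify that a classical polynomial has exactly two critical
# points, both local minima -- an instance of the abstract's claim for
# X = {(-1, 0), (1, 2)} in R^2. (Illustration only, not the report's
# general construction.)
import sympy as sp

x, y = sp.symbols('x y', real=True)
P = (x**2 * y - x - 1)**2 + (x**2 - 1)**2

# Critical points: solutions of grad P = 0.
grad = [sp.diff(P, v) for v in (x, y)]
crit = sp.solve(grad, [x, y], dict=True)
points = sorted((s[x], s[y]) for s in crit)
print(points)  # expected: [(-1, 0), (1, 2)]

# Each critical point is a local minimum: the Hessian is positive
# definite there (leading principal minors are positive).
H = sp.hessian(P, (x, y))
for px, py in points:
    Hp = H.subs({x: px, y: py})
    assert Hp[0, 0] > 0 and Hp.det() > 0
```

Setting grad P = 0 by hand confirms the count: the y-equation forces x^2 (x^2 y - x - 1) = 0; x = 0 is inconsistent with the x-equation, and x^2 y = x + 1 reduces the x-equation to 4x(x^2 - 1) = 0, leaving only x = ±1.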
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.
This document was translated from BibTeX by bibtex2html