Publications by Eduardo D. Sontag in year 1996 |
Books and proceedings |
Articles in journal or book chapters |
This paper summarizes the definitions and several of the main results of an approach to hybrid systems, combining finite automata and linear systems, developed by the author in the early 1980s. Some more recent related results are briefly mentioned as well. |
A shorter and more expository version of "Nonsmooth control-Lyapunov functions". |
Recurrent perceptron classifiers generalize the usual perceptron model. They correspond to linear transformations of input vectors obtained by means of "autoregressive moving-average schemes", or infinite impulse response filters, and make it possible to account for the correlations and dependencies among input coordinates that arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with fitting such models to experimental data. The results are expressed in the context of the theory of probably approximately correct (PAC) learning. |
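The model described in the abstract can be sketched as follows: the input sequence is first passed through an IIR (ARMA) filter and the filtered signal is then thresholded as in an ordinary perceptron. This is an illustrative sketch only; the function names, the choice of coefficient conventions, and the decision rule (sign of the final filtered sample) are assumptions, not the paper's exact formulation.

```python
import numpy as np

def iir_filter(u, a, b):
    """ARMA / IIR filtering: y[t] = sum_i b[i]*u[t-i] - sum_j a[j]*y[t-j],
    with the convention a[0] = 1 (coefficients normalized)."""
    y = np.zeros(len(u))
    for t in range(len(u)):
        acc = sum(b[i] * u[t - i] for i in range(len(b)) if t - i >= 0)
        acc -= sum(a[j] * y[t - j] for j in range(1, len(a)) if t - j >= 0)
        y[t] = acc
    return y

def recurrent_perceptron(u, a, b, threshold=0.0):
    """Classify an input sequence by thresholding the last filtered sample."""
    y = iir_filter(u, a, b)
    return 1 if y[-1] > threshold else 0

# With a = [1.0] (no feedback) and b = [1.0] the filter is the identity,
# and the classifier reduces to an ordinary perceptron on the last input.
print(recurrent_perceptron([1.0, -2.0], [1.0], [1.0]))
```

With a nontrivial feedback coefficient, e.g. a = [1.0, -0.5], the recursion y[t] = u[t] + 0.5*y[t-1] lets past inputs influence the decision, which is the sense in which the classifier is "recurrent".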
This paper presents a Converse Lyapunov Function Theorem motivated by robust control analysis and design. Our result is based upon, but generalizes, various aspects of well-known classical theorems. In a unified and natural manner, it (1) allows arbitrary bounded time-varying parameters in the system description, (2) deals with global asymptotic stability, (3) results in smooth (infinitely differentiable) Lyapunov functions, and (4) applies to stability with respect to not necessarily compact invariant sets. |
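Roughly, and as a hedged sketch of the standard setting such converse theorems address (the precise hypotheses are in the paper), the result concerns systems with bounded time-varying parameters and a closed invariant set \(\mathcal{A}\):

```latex
\dot{x}(t) = f\big(x(t), d(t)\big), \qquad d(t) \in D \ \text{(compact)},
```

and produces a smooth function \(V\) satisfying, for class-\(\mathcal{K}_\infty\) functions \(\alpha_1,\alpha_2,\alpha_3\) and \(|x|_{\mathcal{A}}\) the distance to \(\mathcal{A}\),

```latex
\alpha_1(|x|_{\mathcal{A}}) \le V(x) \le \alpha_2(|x|_{\mathcal{A}}),
\qquad
\nabla V(x)\cdot f(x,d) \le -\alpha_3(|x|_{\mathcal{A}}) \quad \forall\, d \in D .
```

Items (1)-(4) of the abstract correspond to, respectively, the parameter set \(D\), the global range of the estimates, the smoothness of \(V\), and the fact that \(\mathcal{A}\) need not be compact.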
This paper deals with (global) finite-gain input/output stabilization of linear systems with saturated controls. For neutrally stable systems, it is shown that the linear feedback law suggested by the passivity approach indeed provides stability, with respect to every Lp-norm. Explicit bounds on closed-loop gains are obtained, and they are related to the norms for the respective systems without saturation. These results do not extend to the class of systems for which the state matrix has eigenvalues on the imaginary axis with nonsimple (size >1) Jordan blocks, contradicting what may be expected from the fact that such systems are globally asymptotically stabilizable in the state-space sense; this is shown in particular for the double integrator. |
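The double integrator mentioned at the end is the simplest instance of the obstruction: an illustrative sketch (the notation is an assumption, not taken from the paper) is

```latex
\dot{x}_1 = x_2, \qquad \dot{x}_2 = \sigma(u),
\qquad \sigma(u) = \operatorname{sign}(u)\,\min(1, |u|),
```

whose state matrix has a single \(2\times 2\) Jordan block for the eigenvalue \(0\) on the imaginary axis, i.e. a nonsimple block in the sense of the abstract.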
This paper deals with nonlinear least-squares problems involving the fitting to data of parameterized analytic functions. For generic regression data, a general result establishes the countability, and under stronger assumptions finiteness, of the set of functions giving rise to critical points of the quadratic loss function. In the special case of what are usually called "single-hidden layer neural networks", which are built upon the standard sigmoidal activation tanh(x) or equivalently 1/(1+exp(-x)), a rough upper bound for this cardinality is provided as well. |
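The "equivalently" in the abstract refers to the fact that the two standard sigmoids differ only by an affine change of variables, so networks built on either one realize the same class of input-output maps after rescaling weights and biases:

```latex
\tanh(x) \;=\; \frac{1 - e^{-2x}}{1 + e^{-2x}} \;=\; 2\,\sigma(2x) - 1,
\qquad \sigma(x) = \frac{1}{1 + e^{-x}} .
```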
We present new characterizations of the input-to-state stability (ISS) property. As a consequence of these results, we show the equivalence between the ISS property and several apparently different variants proposed in the literature. |
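For context, the ISS property referred to here is standardly formulated as the existence of a class-\(\mathcal{KL}\) function \(\beta\) and a class-\(\mathcal{K}\) function \(\gamma\) such that every trajectory satisfies

```latex
|x(t)| \;\le\; \beta\big(|x(0)|, t\big) \;+\; \gamma\Big(\sup_{0 \le s \le t} |u(s)|\Big),
\qquad \forall\, t \ge 0 ;
```

the variants discussed in the paper replace this estimate by superficially weaker or differently structured conditions shown to be equivalent.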
Conference articles |
Contains a proof of a technical step that was omitted from the journal paper due to space constraints. |
Internal reports |
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.
This document was translated from BibTeX by bibtex2html.