Publications by Eduardo D. Sontag in year 1996
Books and proceedings
  1. R. Alur, T.A. Henzinger, and E.D. Sontag. Hybrid Systems III: Verification and Control (edited book). Springer-Verlag, Berlin, 1996. Note: (LNCS 1066).


Articles in journal or book chapters
  1. E.D. Sontag. Interconnected automata and linear systems: a theoretical framework in discrete-time. In R. Alur, T.A. Henzinger, and E.D. Sontag, editors, Proceedings of the DIMACS/SYCON Workshop on Hybrid Systems III: Verification and Control, pages 436-448. Springer-Verlag New York, Inc., Secaucus, NJ, USA, 1996. [PDF] Keyword(s): hybrid systems.
    Abstract:
    This paper summarizes the definitions and several of the main results of an approach to hybrid systems, which combines finite automata and linear systems, developed by the author in the early 1980s. Some related more recent results are briefly mentioned as well.


  2. E.D. Sontag and H.J. Sussmann. General classes of control-Lyapunov functions. In Stability theory (Ascona, 1995), volume 121 of Internat. Ser. Numer. Math., pages 87-96. Birkhäuser, Basel, 1996. [PDF] Keyword(s): control-Lyapunov functions.
    Abstract:
    A shorter and more expository version of "Nonsmooth control-Lyapunov functions".


  3. B. DasGupta and E.D. Sontag. Sample complexity for learning recurrent perceptron mappings. IEEE Trans. Inform. Theory, 42(5):1479-1487, 1996. [PDF] Keyword(s): machine learning, neural networks, VC dimension, recurrent neural networks.
    Abstract:
    Recurrent perceptron classifiers generalize the usual perceptron model. They correspond to linear transformations of input vectors obtained by means of "autoregressive moving-average schemes", or infinite impulse response filters, and allow taking into account those correlations and dependences among input coordinates which arise from linear digital filtering. This paper provides tight bounds on sample complexity associated to the fitting of such models to experimental data. The results are expressed in the context of the theory of probably approximately correct (PAC) learning.


  4. Y. Lin, E.D. Sontag, and Y. Wang. A smooth converse Lyapunov theorem for robust stability. SIAM J. Control Optim., 34(1):124-160, 1996. [PDF] [doi:http://dx.doi.org/10.1137/S0363012993259981] Keyword(s): input to state stability.
    Abstract:
    This paper presents a Converse Lyapunov Function Theorem motivated by robust control analysis and design. Our result is based upon, but generalizes, various aspects of well-known classical theorems. In a unified and natural manner, it (1) allows arbitrary bounded time-varying parameters in the system description, (2) deals with global asymptotic stability, (3) results in smooth (infinitely differentiable) Lyapunov functions, and (4) applies to stability with respect to not necessarily compact invariant sets.


  5. W. Liu, Y. Chitour, and E.D. Sontag. On finite-gain stabilizability of linear systems subject to input saturation. SIAM J. Control Optim., 34(4):1190-1219, 1996. [PDF] [doi:http://dx.doi.org/10.1137/S0363012994263469] Keyword(s): saturation, bounded inputs.
    Abstract:
    This paper deals with (global) finite-gain input/output stabilization of linear systems with saturated controls. For neutrally stable systems, it is shown that the linear feedback law suggested by the passivity approach indeed provides stability, with respect to every Lp-norm. Explicit bounds on closed-loop gains are obtained, and they are related to the norms for the respective systems without saturation. These results do not extend to the class of systems for which the state matrix has eigenvalues on the imaginary axis with nonsimple (size >1) Jordan blocks, contradicting what may be expected from the fact that such systems are globally asymptotically stabilizable in the state-space sense; this is shown in particular for the double integrator.


  6. E.D. Sontag. Critical points for least-squares problems involving certain analytic functions, with applications to sigmoidal nets. Adv. Comput. Math., 5(2-3):245-268, 1996. [PDF] Keyword(s): machine learning, subanalytic sets, semianalytic sets, critical points, approximation theory, neural networks, real-analytic functions.
    Abstract:
    This paper deals with nonlinear least-squares problems involving the fitting to data of parameterized analytic functions. For generic regression data, a general result establishes the countability, and under stronger assumptions finiteness, of the set of functions giving rise to critical points of the quadratic loss function. In the special case of what are usually called "single-hidden layer neural networks", which are built upon the standard sigmoidal activation tanh(x) or equivalently 1/(1+exp(-x)), a rough upper bound for this cardinality is provided as well.


  7. E.D. Sontag and Y. Wang. New characterizations of input-to-state stability. IEEE Trans. Automat. Control, 41(9):1283-1294, 1996. [PDF] Keyword(s): input to state stability, ISS.
    Abstract:
    We present new characterizations of the input-to-state stability (ISS) property. As a consequence of these results, we show the equivalence between the ISS property and several (apparent) variations proposed in the literature.


Conference articles
  1. F.H. Clarke, Y.S. Ledyaev, E.D. Sontag, and A.I. Subbotin. Asymptotic controllability and feedback stabilization. In Proc. Conf. on Information Sciences and Systems (CISS 96), Princeton, NJ, pages 1232-1237, 1996. Keyword(s): control-Lyapunov functions, feedback stabilization.


  2. B. DasGupta and E.D. Sontag. Sample complexity for learning recurrent perceptron mappings. In D.S. Touretzky, M.C. Mozer, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 204-210, 1996. MIT Press, Cambridge, MA. Keyword(s): machine learning, neural networks, VC dimension, recurrent neural networks.


  3. P. Koiran and E.D. Sontag. Neural networks with quadratic VC dimension. In D.S. Touretzky, M.C. Mozer, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 197-203, 1996. MIT Press, Cambridge, MA. Keyword(s): machine learning, neural networks, VC dimension.


  4. E.D. Sontag and Y. Wang. Detectability of nonlinear systems. In Proc. Conf. on Information Sciences and Systems (CISS 96), Princeton, NJ, pages 1031-1036, 1996. [PDF] Keyword(s): detectability, input to state stability, ISS.
    Abstract:
    Contains a proof of a technical step that was omitted from the journal paper due to space constraints.


Internal reports
  1. E.D. Sontag and F.R. Wirth. Remarks on universal nonsingular controls for discrete-time systems. Technical report 381, Institute for Dynamical Systems, University of Bremen, 1996.


Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.




Last modified: Wed Oct 30 12:09:15 2024
Author: sontag.


This document was translated from BibTeX by bibtex2html