
Publications by Eduardo D. Sontag in 1998
Books and proceedings
  1. E.D. Sontag. Mathematical Control Theory. Deterministic Finite-Dimensional Systems, volume 6 of Texts in Applied Mathematics. Springer-Verlag, New York, Second edition, 1998. [PDF]
    Abstract:
    This book is copyrighted by Springer-Verlag. Springer has kindly allowed me to place a copy on the web, as a reference and for ease of web searches. Please consider buying your own hardcopy.


Articles in journal or book chapters
  1. E.D. Sontag. A general approach to path planning for systems without drift. In J. Baillieul, S. S. Sastry, and H.J. Sussmann, editors, Essays on mathematical robotics (Minneapolis, MN, 1993), volume 104 of IMA Vol. Math. Appl., pages 151-168. Springer, New York, 1998. [PDF] Keyword(s): path-planning, systems without drift, nonlinear control, controllability, real-analytic functions.
    Abstract:
    This paper proposes a generally applicable technique for the control of analytic systems with no drift. The method is based on the generation of "nonsingular loops" that allow linearized controllability. One can then implement Newton and/or gradient searches to find a control. A general convergence theorem is proved.
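
    The Newton/gradient idea can be illustrated on a textbook driftless system. The Python sketch below is an assumption-laden illustration, not the paper's nonsingular-loop construction: it steers the Brockett integrator x1' = u1, x2' = u2, x3' = x1 u2 - x2 u1 by plain gradient descent on the endpoint error over piecewise-constant controls; the horizon, step size, and function names are all invented for the example.

        import numpy as np

        # Driftless example (not from the paper): the Brockett integrator
        #   x1' = u1,  x2' = u2,  x3' = x1*u2 - x2*u1
        def simulate(u, x0, dt):
            """Integrate the system under piecewise-constant controls u of shape (N, 2)."""
            x = np.array(x0, dtype=float)
            for u1, u2 in u:
                dx = np.array([u1, u2, x[0] * u2 - x[1] * u1])
                x = x + dt * dx                      # forward Euler step
            return x

        def endpoint_error(u_flat, x0, target, N, dt):
            """Squared distance between the trajectory endpoint and the target state."""
            return 0.5 * np.sum((simulate(u_flat.reshape(N, 2), x0, dt) - np.asarray(target)) ** 2)

        def steer(x0, target, N=20, dt=0.1, iters=2000, lr=0.5, eps=1e-6):
            """Gradient search for piecewise-constant controls driving x0 toward target."""
            rng = np.random.default_rng(0)
            u = rng.normal(scale=0.1, size=2 * N)    # random start avoids the singular control u = 0
            for _ in range(iters):
                f0 = endpoint_error(u, x0, target, N, dt)
                grad = np.zeros_like(u)
                for i in range(u.size):              # finite-difference gradient of the endpoint error
                    up = u.copy()
                    up[i] += eps
                    grad[i] = (endpoint_error(up, x0, target, N, dt) - f0) / eps
                u = u - lr * grad
            return u.reshape(N, 2), endpoint_error(u, x0, target, N, dt)

        controls, residual = steer(x0=[0.0, 0.0, 0.0], target=[0.0, 0.0, 1.0])
        print("remaining endpoint error:", residual)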


  2. E.D. Sontag. Automata and neural networks. In The handbook of brain theory and neural networks, pages 119-122. MIT Press, Cambridge, MA, USA, 1998. [PDF] Keyword(s): neural networks.


  3. E.D. Sontag. VC dimension of neural networks. In C.M. Bishop, editor, Neural Networks and Machine Learning, pages 69-95. Springer, Berlin, 1998. [PDF] Keyword(s): machine learning, VC dimension, learning, neural networks, shattering.
    Abstract:
    The Vapnik-Chervonenkis (VC) dimension is an integer which helps to characterize distribution-independent learning of binary concepts from positive and negative samples. This paper, based on lectures delivered at the Isaac Newton Institute in August of 1997, presents a brief introduction, establishes various elementary results, and discusses how to estimate the VC dimension in several examples of interest in neural network theory. (It does not address the learning and estimation-theoretic applications of VC dimension, and the applications to uniform convergence theorems for empirical probabilities, for which many suitable references are available.)
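
    As a toy companion to the notion of shattering (an invented example, not one from the chapter): indicator functions of intervals on the real line shatter any two distinct points but no set of three, so this class has VC dimension 2. The brute-force Python check below verifies this directly.

        from itertools import product

        def interval_realizes(points, labels):
            """Is there an interval [a, b] that labels `points` exactly as `labels` (1 = inside)?"""
            pos = [p for p, y in zip(points, labels) if y == 1]
            neg = [p for p, y in zip(points, labels) if y == 0]
            if not pos:
                return True                          # an empty interval realizes the all-negative labeling
            a, b = min(pos), max(pos)                # tightest interval containing every positive point
            return all(not (a <= q <= b) for q in neg)

        def is_shattered(points):
            """True if every dichotomy of `points` is realized by some interval."""
            return all(interval_realizes(points, labels)
                       for labels in product([0, 1], repeat=len(points)))

        print(is_shattered([0.0, 1.0]))              # True: any two points are shattered
        print(is_shattered([0.0, 1.0, 2.0]))         # False: the labeling (1, 0, 1) cannot be realized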


  4. P. Koiran and E.D. Sontag. Vapnik-Chervonenkis dimension of recurrent neural networks. Discrete Appl. Math., 86(1):63-79, 1998. [PDF] [doi:http://dx.doi.org/10.1016/S0166-218X(98)00014-6] Keyword(s): machine learning, neural networks, recurrent neural networks.
    Abstract:
    This paper provides lower and upper bounds for the VC dimension of recurrent networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions. The bounds depend on two independent parameters: the number w of weights in the network, and the length k of the input sequence. Ignoring multiplicative constants, the main results say roughly the following: 1. For architectures whose activation is any fixed nonlinear polynomial, the VC dimension is proportional to wk. 2. For architectures whose activation is any fixed piecewise polynomial, the VC dimension is between wk and w^2 k. 3. For architectures with threshold activations, the VC dimension is between w log(k/w) and the smaller of wk log(wk) and w^2 + w log(wk). 4. For the standard sigmoid tanh(x), the VC dimension is between wk and w^4 k^2.
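
    In the abstract's notation (w weights, k input-sequence length, multiplicative constants suppressed), the four bounds can be summarized as
    \[
      \mathrm{VC} \asymp wk \ \text{(polynomial activation)}, \qquad
      wk \;\lesssim\; \mathrm{VC} \;\lesssim\; w^{2}k \ \text{(piecewise polynomial)},
    \]
    \[
      w\log(k/w) \;\lesssim\; \mathrm{VC} \;\lesssim\; \min\{wk\log(wk),\; w^{2}+w\log(wk)\} \ \text{(threshold)}, \qquad
      wk \;\lesssim\; \mathrm{VC} \;\lesssim\; w^{4}k^{2} \ \ (\tanh).
    \]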


  5. D. Nesic and E.D. Sontag. Input-to-state stabilization of linear systems with positive outputs. Systems Control Lett., 35(4):245-255, 1998. [PDF] Keyword(s): input to state stability, ISS, stabilization.
    Abstract:
    This paper considers the problem of stabilization of linear systems for which only the magnitudes of outputs are measured. It is shown that, if a system is controllable and observable, then one can find a stabilizing controller, which is robust with respect to observation noise (in the ISS sense).


  6. E.D. Sontag. A learning result for continuous-time recurrent neural networks. Systems Control Lett., 34(3):151-158, 1998. [PDF] [doi:http://dx.doi.org/10.1016/S0167-6911(98)00006-1] Keyword(s): machine learning, neural networks, VC dimension, recurrent neural networks.
    Abstract:
    The following learning problem is considered, for continuous-time recurrent neural networks having sigmoidal activation functions. Given a "black box" representing an unknown system, measurements of output derivatives are collected, for a set of randomly generated inputs, and a network is used to approximate the observed behavior. It is shown that the number of inputs needed for reliable generalization (the sample complexity of the learning problem) is upper bounded by an expression that grows polynomially with the dimension of the network and logarithmically with the number of output derivatives being matched.


  7. E.D. Sontag. Comments on integral variants of ISS. Systems Control Lett., 34(1-2):93-100, 1998. [PDF] [doi:http://dx.doi.org/10.1016/S0167-6911(98)00003-6] Keyword(s): input to state stability, integral input to state stability, iISS, ISS.
    Abstract:
    This note discusses two integral variants of the input-to-state stability (ISS) property, which represent nonlinear generalizations of L2 stability, in much the same way that ISS generalizes L-infinity stability. Both variants are equivalent to ISS for linear systems. For general nonlinear systems, it is shown that one of the new properties is strictly weaker than ISS, while the other one is equivalent to it. For bilinear systems, a complete characterization is provided of the weaker property. An interesting fact about functions of type KL is proved as well.
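
    For reference, the two kinds of estimate being compared take the following now-standard forms (stated here for orientation, not verbatim from the note), with beta of class KL, alpha of class K-infinity, and gamma of class K:
    \[
      \text{ISS:}\quad |x(t)| \;\le\; \beta(|x(0)|,t) + \gamma\Big(\sup_{0\le s\le t}|u(s)|\Big),
      \qquad
      \text{integral ISS:}\quad \alpha(|x(t)|) \;\le\; \beta(|x(0)|,t) + \int_0^t \gamma(|u(s)|)\,ds,
      \qquad t \ge 0.
    \]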


  8. E.D. Sontag and F.R. Wirth. Remarks on universal nonsingular controls for discrete-time systems. Systems Control Lett., 33(2):81-88, 1998. [PDF] [doi:http://dx.doi.org/10.1016/S0167-6911(97)00117-5] Keyword(s): discrete time, controllability, real-analytic functions.
    Abstract:
    For analytic discrete-time systems, it is shown that uniform forward accessibility implies the generic existence of universal nonsingular control sequences. A particular application is given by considering forward accessible systems on compact manifolds. For general systems, it is proved that the complement of the set of universal sequences of infinite length is of the first category. For classes of systems satisfying a descending chain condition, and in particular for systems defined by polynomial dynamics, forward accessibility implies uniform forward accessibility.


Conference articles
  1. D. Angeli, E.D. Sontag, and Y. Wang. A remark on integral input to state stability. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 2491-2496. IEEE Publications, 1998. Keyword(s): input to state stability.


  2. X. Bao, Z. Lin, and E.D. Sontag. Some new results on finite gain l_p stabilization of discrete-time linear systems subject to actuator saturation. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 4628-4629. IEEE Publications, 1998. Keyword(s): saturation, bounded inputs.


  3. B. Dasgupta and E.D. Sontag. A polynomial-time algorithm for an equivalence problem which arises in hybrid systems theory. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 1629-1634. IEEE Publications, 1998.


  4. M. Krichman and E.D. Sontag. A version of a converse Lyapunov theorem for input-output to state stability. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 4121-4126. IEEE Publications, 1998. Keyword(s): input to state stability.


  5. P. Kuusela, D. Ocone, and E.D. Sontag. On the VC dimension of continuous-time linear control systems. In Proc. 32nd Annual Conf. on Information Sciences and Systems (CISS 98), Princeton, NJ, pages 795-800, 1998.


  6. Y.S. Ledyaev and E.D. Sontag. Stabilization under measurement noise: Lyapunov characterization. In Proc. American Control Conf., Philadelphia, June 1998, pages 1658-166.


  7. D. Nesic and E.D. Sontag. Output stabilization of nonlinear systems: Linear systems with positive outputs as a case study. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 885-890. IEEE Publications, 1998.


  8. E.D. Sontag. Notions of integral input-to-state stability. In Proc. American Control Conf., Philadelphia, June 1998, pages 3215-321. Keyword(s): input to state stability, integral input to state stability, iISS, ISS.


  9. E.D. Sontag. Recent results on discontinuous stabilization and control-Lyapunov functions. In Proc. Workshop on Control of Nonlinear and Uncertain Systems, London, Feb. 1998. Keyword(s): control-Lyapunov functions.


  10. E.D. Sontag and Y. Qiao. Remarks on controllability of recurrent neural networks. In Proc. IEEE Conf. Decision and Control, Tampa, Dec. 1998, pages 501-506. IEEE Publications, 1998. Keyword(s): machine learning, neural networks, recurrent neural networks.







Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.



