
Publications of Eduardo D. Sontag jointly with W. Maass
Articles in journals or book chapters
  1. W. Maass, P. Joshi, and E.D. Sontag. Computational aspects of feedback in neural circuits. PLoS Computational Biology, 3:e165, 1-20, 2007. [PDF] Keyword(s): machine learning, neural networks, feedback linearization, computation by cortical microcircuits, fading memory.
    Abstract:
    It had previously been shown that generic cortical microcircuit models can perform complex real-time computations on continuous input streams, provided that these computations can be carried out with a rapidly fading memory. We investigate in this article the computational capability of such circuits in the more realistic case where not only the readout neurons but also a few neurons within the circuit have been trained for specific tasks. This is essentially equivalent to the case where the output of trained readout neurons is fed back into the circuit. We show that this new model overcomes the limitation of a rapidly fading memory. In fact, we prove that in the idealized noise-free case it can carry out any conceivable digital or analog computation on time-varying inputs. Even with noise, however, the resulting computational model can perform a large class of biologically relevant real-time computations that require a non-fading memory.
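    A minimal numpy sketch (ours, not the paper's construction) of the fading-memory limitation that the feedback mechanism overcomes: in a generic recurrent circuit with contracting dynamics, two trajectories that differ only in an early input pulse converge, so the pulse is forgotten at a geometric rate. All parameter choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                        # circuit (reservoir) size
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius < 1: fading memory
w_in = rng.normal(0.0, 1.0, N)

def step(x, u):
    """One discrete-time update of a generic rate-based circuit."""
    return np.tanh(W @ x + w_in * u)

x_a = np.zeros(N)                              # sees an input pulse at t = 0
x_b = np.zeros(N)                              # sees no pulse
for t in range(41):
    x_a = step(x_a, 1.0 if t == 0 else 0.0)
    x_b = step(x_b, 0.0)
    if t % 10 == 0:
        print(t, np.linalg.norm(x_a - x_b))    # gap shrinks geometrically
```

    Feeding a trained readout back into step as an extra input is, per the abstract, what lets such a circuit latch information and escape this decay.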


  2. W. Maass, P. Joshi, and E.D. Sontag. Principles of real-time computing with feedback applied to cortical microcircuit models. In Advances in Neural Information Processing Systems 18. MIT Press, Cambridge, 2006. [PDF] Keyword(s): neural networks.
    Abstract:
    The network topology of neurons in the brain exhibits an abundance of feedback connections, but the computational function of these feedback connections is largely unknown. We present a computational theory that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory. It implies that many such systems acquire through feedback universal computational capabilities for analog computing with a non-fading memory. In particular, we show that feedback enables such systems to process time-varying input streams in diverse ways according to rules that are implemented through internal states of the dynamical system. In contrast to previous attractor-based computational models for neural networks, these flexible internal states are high-dimensional attractors of the circuit dynamics that still allow the circuit state to absorb new information from online input streams. In this way one arrives at novel models for working memory, integration of evidence, and reward expectation in cortical circuits. We show that they are applicable to circuits of conductance-based Hodgkin-Huxley (HH) neurons with high levels of noise that reflect experimental data on in vivo conditions.
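    For reference, one common way to state the fading-memory property that this theory builds on (in the spirit of Boyd and Chua's definition; the notation here is ours, not the paper's):

```latex
% A filter F has fading memory if inputs that are close on a
% sufficiently long recent window produce outputs that are close now.
\[
\forall\, \varepsilon > 0 \;\; \exists\, \delta > 0,\ T > 0 : \quad
\sup_{0 \le s \le T} \bigl|u(t-s) - v(t-s)\bigr| < \delta
\;\Longrightarrow\;
\bigl|(Fu)(t) - (Fv)(t)\bigr| < \varepsilon .
\]
```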


  3. W. Maass and E.D. Sontag. Neural Systems as Nonlinear Filters. Neural Comput., 12(8):1743-1772, 2000. [PDF] [doi:http://dx.doi.org/10.1162/089976600300015123] Keyword(s): neural networks, Volterra series.
    Abstract:
    We analyze computations on temporal patterns and spatio-temporal patterns in formal network models whose temporal dynamics arises from empirically established quantitative models for short-term dynamics at biological synapses. We give a complete characterization of all linear and nonlinear filters that can be approximated by such dynamic network models: it is the class of all filters that can be approximated by Volterra series. This characterization is shown to be rather stable with regard to changes in the model. For example, it is shown that synaptic facilitation and one layer of neurons suffice for approximating arbitrary filters from this class. Our results provide a new complexity hierarchy for all filters that are approximable by Volterra series, which appears to be more closely related to the actual cost of implementing such filters in neural hardware than preceding complexity measures. Our results also provide a new parameterization for approximations to such filters in terms of parameters that are arguably related to those that are tunable in biological neural systems.
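    For context, the class referred to here: a filter given by a Volterra series is a sum of integral terms of increasing order (our notation, not the paper's):

```latex
% Volterra series representation of a filter F applied to input x,
% with kernels h_n of increasing order.
\[
(Fx)(t) = h_0 + \sum_{n=1}^{\infty}
\int_0^{\infty}\!\!\cdots\!\int_0^{\infty}
h_n(\tau_1,\dots,\tau_n)\,
x(t-\tau_1)\cdots x(t-\tau_n)\,
d\tau_1 \cdots d\tau_n .
\]
% The n = 1 term is an ordinary linear convolution; higher-order
% kernels capture nonlinear interactions among past input values.
```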


  4. W. Maass and E.D. Sontag. Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages. Neural Comput., 11(3):771-782, 1999. [PDF] [doi:http://dx.doi.org/10.1162/089976699300016656] Keyword(s): machine learning, neural networks.
    Abstract:
    We consider recurrent analog neural nets where the output of each gate is subject to Gaussian noise, or to any other common noise distribution that is nonzero on a large set. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of those languages that can be recognized. This result implies severe constraints on the possibilities for constructing recurrent analog neural nets that are robust against realistic types of analog noise. On the other hand, we present a method for constructing feedforward analog neural nets that are robust with regard to analog noise of this type.
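    A toy numpy simulation (our illustration, not the paper's construction) of the kind of constraint the result describes: an analog unit that tries to track the parity of a bit stream, with Gaussian noise injected at every step, makes errors at a rate that grows with input length, so parity (a regular language) cannot be recognized reliably. The noise level and gain below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def parity_error_rate(T, sigma, trials=2000):
    """Fraction of length-T bit streams whose parity is misread by a
    single analog unit (state near +1 = even, -1 = odd) when Gaussian
    noise of std sigma is injected into the state at every step."""
    errors = 0
    for _ in range(trials):
        bits = rng.integers(0, 2, T)
        s = 1.0
        for b in bits:
            s = np.tanh(2.0 * s * (1 - 2 * b) + rng.normal(0.0, sigma))
        errors += int(np.sign(s) != 1 - 2 * (bits.sum() % 2))
    return errors / trials

for T in (5, 20, 80):
    print(T, parity_error_rate(T, sigma=1.0))   # climbs toward 1/2 as T grows
```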


  5. W. Maass, G. Schnitger, and E.D. Sontag. A comparison of the computational power of sigmoid and Boolean threshold circuits. In V. P. Roychowdhury, K. Y. Siu, and A. Orlitsky, editors, Theoretical Advances in Neural Computation and Learning, pages 127-151. Kluwer Academic Publishers, 1994. [PDF] Keyword(s): machine learning, neural networks, boolean systems.
    Abstract:
    We examine the power of constant-depth circuits with sigmoid threshold gates for computing Boolean functions. It is shown that, for depth 2, constant-size circuits of this type are strictly more powerful than constant-size Boolean threshold circuits (i.e., circuits with linear threshold gates). On the other hand, it turns out that, for any constant depth d, polynomial-size sigmoid threshold circuits with polynomially bounded weights compute exactly the same Boolean functions as the corresponding circuits with linear threshold gates.
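    The two gate models being compared, in a minimal numpy sketch (our illustration; the weights and threshold are arbitrary):

```python
import numpy as np

def threshold_gate(w, theta, x):
    """Boolean (linear) threshold gate: fires iff the weighted sum
    of its inputs reaches the threshold."""
    return int(np.dot(w, x) >= theta)

def sigmoid_gate(w, theta, x):
    """Sigmoid threshold gate: a smooth, graded version of the same
    gate; a circuit's Boolean output is read off by rounding the
    real-valued output of its top gate."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) - theta)))

x = np.array([1, 0, 1])                  # Boolean input vector
w = np.array([1.0, 1.0, 1.0])
print(threshold_gate(w, 1.5, x))         # 1: weighted sum 2 exceeds 1.5
print(sigmoid_gate(w, 1.5, x))           # ~0.62, rounds to the same value
```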


Conference articles
  1. T. Natschläger, W. Maass, E.D. Sontag, and A. Zador. Processing of time series by neural circuits with biologically realistic synaptic dynamics. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13 (NIPS 2000), pages 145-151, 2000. MIT Press, Cambridge. [PDF] Keyword(s): neural networks, Volterra series.
    Abstract:
    Experimental data show that biological synapses are dynamic, i.e., their weight changes on a short time scale by several hundred percent, depending on the past input to the synapse. In this article we explore the consequences that this synaptic dynamics entails for the computational power of feedforward neural networks. It turns out that even with just a single hidden layer such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics. Furthermore, we show that simple gradient descent suffices to approximate a given quadratic filter by a rather small neural system with dynamic synapses.
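    A sketch of the kind of short-term synaptic dynamics the paper refers to, in the spirit of the standard Markram-Tsodyks model (the parameter values and the exact update order are illustrative; conventions vary in the literature):

```python
import numpy as np

def dynamic_synapse(spike_times, U=0.4, D=0.5, F=0.1, A=1.0):
    """Response amplitude at each presynaptic spike for a synapse with
    short-term facilitation (u) and depression (R). Between spikes, u
    decays back to U with time constant F and resources R recover to 1
    with time constant D; each spike consumes a fraction u of R."""
    u, R = U, 1.0
    amps, last = [], None
    for t in spike_times:
        if last is not None:
            dt = t - last
            u = U + (u - U) * np.exp(-dt / F)      # facilitation decays
            R = 1.0 + (R - 1.0) * np.exp(-dt / D)  # resources recover
        amps.append(A * u * R)                     # postsynaptic amplitude
        R -= u * R                                 # depression: resources used
        u += U * (1.0 - u)                         # facilitation from this spike
        last = t
    return amps

# A fast burst depresses the synapse; a long pause lets it recover.
print(dynamic_synapse([0.00, 0.02, 0.04, 0.06, 0.60]))
```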


  2. W. Maass and E.D. Sontag. A precise characterization of the class of languages recognized by neural nets under Gaussian and other common noise distributions. In Advances in Neural Information Processing Systems 11 (NIPS 1998), pages 281-287, 1999. MIT Press, Cambridge, MA. [PDF] Keyword(s): machine learning, neural networks.


  3. W. Maass, G. Schnitger, and E.D. Sontag. On the computational power of sigmoid versus Boolean threshold circuits (extended abstract). In Proceedings of the 32nd Annual Symposium on Foundations of Computer Science, Los Alamitos, CA, USA, pages 767-776, 1991. IEEE Computer Society Press. Keyword(s): machine learning, neural networks, theory of computing and complexity.







Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.




Last modified: Fri Sep 20 11:51:28 2024
Author: sontag.


This document was translated from BibTeX by bibtex2html