Publications about 'fading memory'
Articles in journal or book chapters
  1. W. Maass, P. Joshi, and E.D. Sontag. Computational aspects of feedback in neural circuits. PLoS Computational Biology, 3:e165, 1-20, 2007. [PDF] Keyword(s): machine learning, neural networks, feedback linearization, computation by cortical microcircuits, fading memory.
    It had previously been shown that generic cortical microcircuit models can perform complex real-time computations on continuous input streams, provided that these computations can be carried out with a rapidly fading memory. We investigate in this article the computational capability of such circuits in the more realistic case where not only readout neurons, but in addition a few neurons within the circuit have been trained for specific tasks. This is essentially equivalent to the case where the output of trained readout neurons is fed back into the circuit. We show that this new model overcomes the limitation of a rapidly fading memory. In fact, we prove that in the idealized case without noise it can carry out any conceivable digital or analog computation on time-varying inputs. But even with noise the resulting computational model can perform a large class of biologically relevant real-time computations that require a non-fading memory.

  2. W. Maass, P. Joshi, and E.D. Sontag. Principles of real-time computing with feedback applied to cortical microcircuit models. In Advances in Neural Information Processing Systems 18. MIT Press, Cambridge, 2006. [PDF] Keyword(s): neural networks.
    The network topology of neurons in the brain exhibits an abundance of feedback connections, but the computational function of these feedback connections is largely unknown. We present a computational theory that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory. It implies that many such systems acquire through feedback universal computational capabilities for analog computing with a non-fading memory. In particular, we show that feedback enables such systems to process time-varying input streams in diverse ways according to rules that are implemented through internal states of the dynamical system. In contrast to previous attractor-based computational models for neural networks, these flexible internal states are high-dimensional attractors of the circuit dynamics, which still allow the circuit state to absorb new information from online input streams. In this way one arrives at novel models for working memory, integration of evidence, and reward expectation in cortical circuits. We show that they are applicable to circuits of conductance-based Hodgkin-Huxley (HH) neurons with high levels of noise that reflect experimental data on in vivo conditions.

  3. M. A. Dahleh, E.D. Sontag, D. N. C. Tse, and J. N. Tsitsiklis. Worst-case identification of nonlinear fading memory systems. Automatica, 31(3):503-508, 1995. [PDF] Keyword(s): information-based complexity, fading-memory systems, stability, system identification, structured uncertainty.
    We consider the problem of characterizing possible supply functions for a given dissipative nonlinear system, and provide a result that allows some freedom in the modification of such functions.
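The contrast drawn in the first two abstracts above, a recurrent circuit that on its own has only fading memory versus the same circuit with a trained readout fed back into it, which can then hold information indefinitely, can be sketched with a toy leaky tanh network. This is only an illustrative caricature, not the cortical microcircuit or Hodgkin-Huxley models studied in the papers; the network size, leak rate, weight scale, and threshold below are arbitrary choices.

```python
import math
import random

random.seed(0)

N = 20        # number of units in the toy circuit
LEAK = 0.6    # leaky-integration rate

# Small random recurrent weights keep the dynamics contractive,
# i.e. the circuit has fading memory on its own.
W = [[random.uniform(-0.04, 0.04) for _ in range(N)] for _ in range(N)]
w_in = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(x, u, fb=0.0):
    """One leaky-tanh update; `fb` is a readout value fed back as extra input."""
    new = []
    for i in range(N):
        pre = sum(W[i][j] * x[j] for j in range(N)) + w_in[i] * (u + fb)
        new.append((1 - LEAK) * x[i] + LEAK * math.tanh(pre))
    return new

def run(T=200, feedback=False):
    """Drive the circuit with one input pulse at t=0 and return the
    mean absolute activity at time T (a crude memory trace)."""
    x = [0.0] * N
    fb = 0.0
    for t in range(T):
        u = 1.0 if t == 0 else 0.0
        x = step(x, u, fb)
        readout = sum(abs(v) for v in x) / N
        if feedback:
            # Thresholded readout fed back into the circuit: once the pulse
            # has been seen, the feedback keeps the state away from rest.
            fb = 1.0 if readout > 0.005 else 0.0
    return sum(abs(v) for v in x) / N

no_fb = run()                  # fading memory: the pulse is forgotten
with_fb = run(feedback=True)   # feedback sustains a persistent state
```

Without feedback the contractive dynamics wash out the pulse (`no_fb` decays toward zero); with the thresholded readout fed back as input, the circuit settles into a self-sustaining active state (`with_fb` stays bounded away from zero), a one-bit analogue of the non-fading memory discussed in the abstracts.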

Conference articles
  1. M.A. Dahleh, E.D. Sontag, D.N.C. Tse, and J.N. Tsitsiklis. Worst-case identification of nonlinear fading memory systems. In Proc. Amer. Automatic Control Conf., Chicago, June 1992, pages 241-245, 1992. [PDF] Keyword(s): information-based complexity, fading-memory systems, stability, system identification, structured uncertainty.
    Preliminary version of paper published in Automatica in 1995.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.

Last modified: Wed Apr 17 19:59:02 2024
Author: sontag.

This document was translated from BibTeX by bibtex2html