BACK TO INDEX

Publications about 'NeurIPS'
Articles in journal or book chapters
  1. W. Maass, P. Joshi, and E.D. Sontag. Principles of real-time computing with feedback applied to cortical microcircuit models. In Advances in Neural Information Processing Systems 18. MIT Press, Cambridge, 2006. Note: Proc. NIPS(NeurIPS)-18, Vancouver 2005, https://proceedings.neurips.cc/paper/2005. [PDF] Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks.
    Abstract:
    The network topology of neurons in the brain exhibits an abundance of feedback connections, but the computational function of these feedback connections is largely unknown. We present a computational theory that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory. It implies that many such systems acquire through feedback universal computational capabilities for analog computing with a non-fading memory. In particular, we show that feedback enables such systems to process time-varying input streams in diverse ways according to rules that are implemented through internal states of the dynamical system. In contrast to previous attractor-based computational models for neural networks, these flexible internal states are high-dimensional attractors of the circuit dynamics that still allow the circuit state to absorb new information from online input streams. In this way one arrives at novel models for working memory, integration of evidence, and reward expectation in cortical circuits. We show that these models are applicable to circuits of conductance-based Hodgkin-Huxley (HH) neurons with high levels of noise that reflect experimental data on in vivo conditions.
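The contrast the abstract draws between fading and non-fading memory can be illustrated with a deliberately minimal one-unit sketch (this is an illustrative toy, not the paper's cortical microcircuit model): a leaky unit forgets an input pulse, while adding a saturating feedback loop creates a stable attractor that latches the pulse indefinitely.

```python
import numpy as np

# Hypothetical one-unit sketch (illustrative only): a leaky state x has
# fading memory, so a single input pulse decays away; a saturating
# feedback term creates a nonzero fixed point that stores the pulse,
# i.e. a non-fading working memory.

def run(inputs, feedback_gain):
    x = 0.0
    for u in inputs:
        # leaky dynamics plus (optional) saturating self-feedback
        x = 0.5 * x + u + feedback_gain * np.tanh(4.0 * x)
    return x

pulse = [1.0] + [0.0] * 50    # one input pulse, then silence

no_feedback = run(pulse, 0.0)    # state decays back toward 0: pulse forgotten
with_feedback = run(pulse, 1.0)  # state latches near a stable fixed point
```

With `feedback_gain = 0` the state shrinks by half every step and the pulse is lost; with feedback, the state converges to the attractor of `x = 0.5*x + tanh(4*x)` and the pulse remains readable long after it ended.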


Conference articles
  1. T. Natschläger, W. Maass, E.D. Sontag, and A. Zador. Processing of time series by neural circuits with biologically realistic synaptic dynamics. In Todd K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13 (NIPS2000), pages 145-151, 2000. MIT Press, Cambridge. Note: Proc. NIPS(NeurIPS)-13, Denver, 2000, https://papers.nips.cc/paper_files/paper/2000. [PDF] Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks, Volterra series.
    Abstract:
    Experimental data show that biological synapses are dynamic, i.e., their weight changes on a short time scale by several hundred percent, depending on the past input to the synapse. In this article we explore the consequences that these synaptic dynamics entail for the computational power of feedforward neural networks. It turns out that even with just a single hidden layer such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics. Furthermore we show that simple gradient descent suffices to approximate a given quadratic filter by a rather small neural system with dynamic synapses.
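The "weight changes by several hundred percent depending on past input" claim can be made concrete with a small sketch in the spirit of the Tsodyks-Markram dynamic-synapse model (the parameter values `U`, `tau_rec`, `tau_fac` here are illustrative choices, not taken from the paper): each spike's effective weight depends on the recent spike history through facilitation and resource depletion.

```python
import numpy as np

# Hypothetical dynamic-synapse sketch (Tsodyks-Markram style; parameter
# values are illustrative). For each presynaptic spike, the effective
# weight is w * u * r, where u (utilization) facilitates with activity
# and r (available resources) is depleted by each spike and recovers
# between spikes. Times are in seconds.

def synaptic_weights(spike_times, U=0.2, tau_rec=0.3, tau_fac=0.5, w=1.0):
    weights = []
    u, r = 0.0, 1.0          # utilization and available resources
    prev = None
    for t in spike_times:
        dt = np.inf if prev is None else t - prev
        u = u * np.exp(-dt / tau_fac)                 # facilitation decays
        r = 1.0 - (1.0 - r) * np.exp(-dt / tau_rec)   # resources recover
        u = u + U * (1.0 - u)                         # spike boosts utilization
        weights.append(w * u * r)                     # effective weight of this spike
        r = r * (1.0 - u)                             # spike consumes resources
        prev = t
    return weights

# A rapid 100 Hz train: the second spike is facilitated, but sustained
# activity depletes resources, so late weights are strongly depressed.
train = [i * 0.01 for i in range(20)]
ws = synaptic_weights(train)
```

For this train the second weight exceeds the first (facilitation) while the last weight falls well below it (depression), so the same synapse transmits very different weights for different input histories, which is exactly the history dependence the filter-approximation result exploits.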


  2. W. Maass and E.D. Sontag. A precise characterization of the class of languages recognized by neural nets under Gaussian and other common noise distributions. In Proceedings of the 1998 conference on Advances in Neural Information Processing Systems 11, Cambridge, MA, USA, pages 281-287, 1999. MIT Press. Note: Proc. NIPS(NeurIPS)-11, Denver, 1998, https://papers.nips.cc/paper_files/paper/1998. [PDF] Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks.


  3. B. Dasgupta and E.D. Sontag. Sample complexity for learning recurrent perceptron mappings. In D.S. Touretzky, M.C. Moser, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 204-210, 1996. MIT Press, Cambridge, MA. Note: Proc. NIPS(NeurIPS)-8, Denver, 1995, https://papers.nips.cc/paper_files/paper/1995. Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks, VC dimension, recurrent neural networks.


  4. P. Koiran and E.D. Sontag. Neural networks with quadratic VC dimension. In D.S. Touretzky, M.C. Moser, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 197-203, 1996. MIT Press, Cambridge, MA. Note: Proc. NIPS(NeurIPS)-8, Denver, 1995, https://papers.nips.cc/paper_files/paper/1995. Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks, VC dimension.


  5. E.D. Sontag. Remarks on interpolation and recognition using neural nets. In NIPS-3: Proceedings of the 1990 conference on Advances in Neural Information Processing Systems 3, San Francisco, CA, USA, pages 939-945, 1990. Morgan Kaufmann Publishers Inc. Note: Proc. NIPS(NeurIPS)-3, Denver, 1990, https://papers.nips.cc/paper_files/paper/1990. Keyword(s): NeurIPS, machine learning, artificial intelligence, neural networks.







Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders.




Last modified: Sat Sep 27 12:15:53 2025
Author: sontag.


This document was translated from BibTeX by bibtex2html