Publications of Eduardo D. Sontag jointly with R. Gavaldà
J. L. Balcázar,
R. Gavaldà,
H. T. Siegelmann,
and E. D. Sontag.
Some structural complexity aspects of neural computation.
In Proceedings of the Eighth Annual Structure in Complexity Theory Conference (San Diego, CA, 1993),
IEEE Comput. Soc. Press,
Los Alamitos, CA.
Keyword(s): machine learning,
theory of computing and complexity.
Recent work by H.T. Siegelmann and E.D. Sontag (1992) has demonstrated that polynomial time on saturated-linear recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rational numbers, and nonuniform circuits if the weights are real. Here, further connections between the languages recognized by such neural nets and other complexity classes are developed: connections to space-bounded classes, simulations of parallel computational models such as Vector Machines, and characterizations of various nonuniform classes in terms of Kolmogorov complexity.
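The networks in question update their state by an affine map followed by a saturated-linear ("clipped") activation. As a minimal sketch of that model class (the matrix names `A`, `B`, `c` and the tiny example weights below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def sat(z):
    # Saturated-linear activation: 0 for z < 0, z for 0 <= z <= 1, 1 for z > 1.
    return np.clip(z, 0.0, 1.0)

def step(x, u, A, B, c):
    # One update of a saturated-linear recurrent net: x_next = sat(A x + B u + c).
    # With rational entries in A, B, c the model is Turing-equivalent in
    # polynomial time; with real entries it matches nonuniform circuits.
    return sat(A @ x + B @ u + c)

# Hypothetical two-neuron example with rational weights.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.5]])
c = np.zeros(2)
x = np.array([1.0, -2.0])   # current state
u = np.array([1.0])         # current input
x_next = step(x, u, A, B, c)  # components saturate into [0, 1]
```

Note that each coordinate of the state stays in [0, 1] after every update, which is what makes the activation "saturated."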
This material is presented to ensure timely dissemination of
scholarly and technical work. Copyright and all rights therein
are retained by authors or by other copyright holders.
Last modified: Wed Aug 17 10:22:06 2022
This document was translated from BibTeX by