Publications about 'empirical risk minimization'
Articles in journals or book chapters
and E.D. Sontag.
Learning recurrent neural net models of nonlinear systems.
Proceedings of Machine Learning Research,
Keyword(s): machine learning, empirical risk minimization, recurrent neural networks, statistical learning theory.
This paper considers the following learning problem: given sample pairs of input and output signals generated by an unknown nonlinear system (which is not assumed to be causal or time-invariant), one wishes to find a continuous-time recurrent neural net, with activation function tanh, that approximately reproduces the underlying i/o behavior with high confidence. Leveraging earlier work on matching derivatives of the input and output signals up to a finite order, the problem is reformulated in familiar system-theoretic language, and quantitative guarantees on the sup-norm risk of the learned model are derived in terms of the number of neurons, the sample size, the number of derivatives being matched, and the regularity properties of the inputs, the outputs, and the unknown i/o map.
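The setting in the abstract can be made concrete with a minimal sketch: a continuous-time recurrent neural net with tanh activation, simulated by forward Euler, whose output is compared against observed i/o data via an empirical sup-norm risk. All names here (W, U, C, the dynamics dx/dt = -x + W tanh(x) + U u, the step size dt) are illustrative assumptions for this sketch, not the paper's notation or results.

```python
import numpy as np

def simulate_ctrnn(W, U, C, u, dt=0.01, x0=None):
    """Simulate a CT-RNN  dx/dt = -x + W tanh(x) + U u(t),  y = C x
    on an input sequence u of shape (T, m); return outputs of shape (T, p).
    (Hypothetical model class; forward-Euler discretization.)"""
    n = W.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    ys = []
    for u_t in u:
        ys.append(C @ x)
        x = x + dt * (-x + W @ np.tanh(x) + U @ u_t)
    return np.array(ys)

def sup_norm_risk(model_outputs, true_outputs):
    """Empirical sup-norm risk: worst-case output deviation over the trajectory."""
    return np.max(np.abs(model_outputs - true_outputs))

# Toy usage: treat one randomly drawn net as the "unknown" system,
# then measure the sup-norm risk of a slightly perturbed candidate net.
rng = np.random.default_rng(0)
n, m, p, T = 4, 1, 1, 200
W_true = 0.5 * rng.standard_normal((n, n))
U_true = rng.standard_normal((n, m))
C_true = rng.standard_normal((p, n))
u = np.sin(0.1 * np.arange(T)).reshape(T, m)        # sample input signal
y_true = simulate_ctrnn(W_true, U_true, C_true, u)  # observed output signal

y_hat = simulate_ctrnn(W_true + 0.01, U_true, C_true, u)  # candidate model
risk = sup_norm_risk(y_hat, y_true)
```

Learning would then amount to minimizing this empirical risk over the parameters (W, U, C); the paper's guarantees bound how the resulting sup-norm risk scales with the number of neurons, sample size, and the other quantities listed above.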
This material is presented to ensure timely dissemination of
scholarly and technical work. Copyright and all rights therein
are retained by authors or by other copyright holders.
Last modified: Mon Nov 7 18:17:06 2022