TITLE: Training Asymptotically Stable Recurrent Neural Networks
AUTHORS: Nikitas J. Dimopoulos, John Dorocicz, Chris Jubien, and Stephen W. Neville
IN: Intelligent Automation and Soft Computing, Vol. 2, No. 4, 1996, pp. 375-388.

ABSTRACT

In this work we present a class of recurrent networks that are asymptotically stable. We discuss the similarity of these networks to certain structures in the central nervous system, and prove that if an interconnection pattern that does not allow excitatory feedback is used, then the resulting recurrent neural network is stable. We introduce a training methodology for networks belonging to this class, and use it to train networks that successfully identify a number of nonlinear systems.
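The abstract's structural condition, an interconnection pattern with no excitatory feedback, can be illustrated as a graph-theoretic check on the recurrent weight matrix. The sketch below is only a minimal illustration under an assumption of our own: that "no excitatory feedback" means the subgraph of positive (excitatory) connections contains no directed cycle. The function name, weight convention, and threshold are illustrative, not the paper's formulation.

```python
import numpy as np

def has_excitatory_feedback(W, tol=0.0):
    """Return True if the excitatory (positive-weight) subgraph of the
    recurrent weight matrix W contains a directed cycle.

    An entry W[i, j] > tol is read as an excitatory connection between
    neurons i and j; since only the existence of a cycle matters, the
    row/column direction convention does not change the result.
    """
    n = W.shape[0]
    # Adjacency lists of the excitatory subgraph only.
    adj = [[j for j in range(n) if W[i, j] > tol] for i in range(n)]
    # Depth-first cycle detection: 0 = unvisited, 1 = on stack, 2 = done.
    state = [0] * n

    def dfs(i):
        state[i] = 1
        for j in adj[i]:
            if state[j] == 1 or (state[j] == 0 and dfs(j)):
                return True
        state[i] = 2
        return False

    return any(state[i] == 0 and dfs(i) for i in range(n))

# Mutual excitation between two neurons forms a positive loop...
W_bad = np.array([[0.0, 0.5],
                  [0.7, 0.0]])
# ...while feedback through an inhibitory (negative) weight does not.
W_ok = np.array([[0.0, -0.5],
                 [0.7,  0.0]])
```

Here `has_excitatory_feedback(W_bad)` returns True while `has_excitatory_feedback(W_ok)` returns False, matching the idea that stability is guaranteed when excitatory connections never close a loop.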