Recurrent neural networks (RNNs) are major machine learning tools for processing sequential data. Piecewise-linear RNNs (PLRNNs) in particular, which formally are piecewise-linear (PWL) maps, have recently become popular as data-driven techniques for dynamical systems reconstruction from time-series observations. A more thorough theoretical analysis is needed to better understand the training process, performance, and behavior of trained PLRNNs. In particular, the presence of chaos strongly affects RNN training and expressivity. Here we show the existence of robust chaos in 2d PLRNNs. To this end, we derive necessary and sufficient conditions for the occurrence of homoclinic intersections by analyzing the interplay between the stable and unstable manifolds of 2d PWL maps. Our analysis addresses general PWL maps, like PLRNNs, since normal-form PWL maps lack important characteristics that can occur in PLRNNs. We also explore bifurcations and multistability involving chaos: the coexistence of chaotic attractors with other attractors poses particular challenges for PLRNN training on the one hand, yet may endow trained PLRNNs with important computational properties on the other. Numerical simulations verify our results and are in good agreement with the theoretical derivations. We discuss the implications of our results for PLRNN training, performance on machine learning tasks, and scientific applications.
Mathematics Subject Classification (2020) 37F46 · 34H10
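For concreteness, the dynamics discussed above can be sketched as follows. The latent map $z_{t+1} = A z_t + W\,\phi(z_t) + h$ with $\phi(\cdot) = \max(0, \cdot)$ is the standard PLRNN form; the specific parameter values below are illustrative assumptions only (not taken from the paper), chosen here to be contractive in every linear region, and the Lyapunov-exponent estimate is a generic numerical diagnostic for chaos, not the paper's analytical method.

```python
import numpy as np

# Standard 2d PLRNN latent map: z_{t+1} = A z_t + W relu(z_t) + h.
# The parameter values below are illustrative assumptions; whether the map
# is chaotic depends on (A, W, h), which is what the paper's analysis addresses.
A = np.diag([0.5, 0.5])        # diagonal linear part
W = np.array([[0.0, 0.4],
              [-0.4, 0.0]])    # off-diagonal PWL coupling
h = np.array([1.0, -1.0])      # bias

def step(z):
    return A @ z + W @ np.maximum(z, 0.0) + h

def jacobian(z):
    # The map is piecewise linear: the Jacobian is A + W D(z),
    # where D(z) = diag(1[z > 0]) selects the active linear region.
    return A + W @ np.diag((z > 0).astype(float))

# Iterate the map and estimate the largest Lyapunov exponent by tracking
# the growth of a tangent vector, renormalized at every step.
z = np.zeros(2)
v = np.array([1.0, 0.0])
log_growth = 0.0
T = 5000
for _ in range(T):
    v = jacobian(z) @ v
    log_growth += np.log(np.linalg.norm(v))
    v /= np.linalg.norm(v)
    z = step(z)

lyap = log_growth / T
print("largest Lyapunov exponent estimate:", lyap)
```

A positive estimate would indicate chaotic dynamics; for the deliberately contractive parameters chosen here, every region's Jacobian has spectral norm below one, so the estimate comes out negative and the orbit remains bounded.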