Chaos Control in Recurrent Neural Networks Using a Sinusoidal Activation Function via the Periodic Pulse Method
Abstract
Controlling chaos in recurrent neural networks (RNNs) is a crucial challenge in both computational neuroscience and artificial intelligence. Chaotic behavior in these networks can hinder stability and predictability, particularly in systems that require structured memory and temporal processing. In this study, we apply the periodic pulse method to stabilize the dynamics of chaotic RNNs built on a sinusoidal activation function. Two network configurations (two and three neurons) were analyzed through numerical simulations in MATLAB. Our results show that the periodic pulse method effectively suppresses chaotic behavior, as evidenced by a reduction in the largest Lyapunov exponent from 0.317 to -0.042: the system transitions from an unpredictable chaotic regime to a stabilized fixed point. This confirms the method’s potential to regulate nonlinear neural dynamics with minimal external perturbation. Future work will extend this approach to larger recurrent architectures (LSTMs, reservoir computing models) and compare its performance with other chaos control strategies such as delayed feedback and chaotic synchronization. This study contributes to the understanding of chaos in neural networks and to its potential applications in neuroscience and AI.
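The pulsed dynamics summarized above can be illustrated with a short simulation. The sketch below, written for MATLAB as used in the study, iterates a two-neuron map x(k+1) = sin(W x(k) + b), adds a small additive pulse every P steps, and estimates the largest Lyapunov exponent with a two-trajectory (Benettin-style) renormalization scheme. This is not the authors' code: the weight matrix, biases, pulse amplitude, and pulse period are illustrative assumptions, not values reported in the paper.

```matlab
% Minimal sketch (not the authors' implementation): a two-neuron
% discrete-time RNN with a sinusoidal activation, perturbed by a
% periodic pulse, plus a two-trajectory (Benettin-style) estimate of
% the largest Lyapunov exponent. All numerical values below (weights,
% biases, pulse amplitude and period) are illustrative assumptions.

W  = [2.0, -3.5; 3.5, 2.0];   % recurrent weight matrix (assumed)
b  = [0.1; -0.2];             % bias vector (assumed)
A  = 0.5;                     % pulse amplitude (assumed)
P  = 10;                      % pulse period, in iterations (assumed)
N  = 5000;                    % number of iterations
d0 = 1e-8;                    % initial separation for the LLE estimate

x = [0.30; -0.10];            % reference trajectory
y = x + d0*[1; 0];            % nearby perturbed trajectory
s = 0;                        % running sum of log stretching factors

for k = 1:N
    % one step of the map: x(k+1) = sin(W*x(k) + b)
    x = sin(W*x + b);
    y = sin(W*y + b);
    if mod(k, P) == 0         % periodic pulse, applied to both copies
        x = x + A;
        y = y + A;
    end
    % renormalize the separation and accumulate its growth rate
    d = max(norm(y - x), realmin);
    s = s + log(d/d0);
    y = x + (d0/d)*(y - x);
end

lambda = s/N;                 % largest Lyapunov exponent estimate
fprintf('Estimated largest Lyapunov exponent: %.4f\n', lambda);
```

Because the pulse is applied identically to both trajectories, it does not alter the local stretching measured by the exponent estimate; sweeping the assumed amplitude A and period P in this sketch is a simple way to explore the qualitative transition from a positive to a negative exponent of the kind reported in the abstract.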
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.