A Neural Network Framework Based on Symmetric Differential Equations

Authors: Kun Jiang
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 1277-1292
Keywords: Symmetric differential equations, Fixed point, Multilayer perceptron, Neural network, Backward propagation.

Abstract

Modern mathematical neural networks are derived from biological neural networks, yet the currently popular general large models do not incorporate biological neural networks. The primary reason is that the differential equations based on biological neural networks are difficult to manipulate. At present, mathematical neural networks are characterized by their capacity for large-scale deployment, while biological neural networks offer strong biological interpretability. This paper introduces a system of differential equations with perfect symmetry and convenient manipulability, enabling us to manipulate this system as easily as we manipulate numbers in a matrix, thus integrating the advantages of both. As we are introducing a brand-new neural network framework, we first explore the mathematical properties of the differential equations, then define a new signal propagation method, and finally propose a new training approach for the neural network. The training of this new neural network does not rely on the traditional back-propagation algorithm; instead, it depends solely on the propagation of local signals. This implies that we no longer require global information to train the network. Each neuron can adjust based on the signals it receives and its predetermined strategy. As a verification, we mimicked the linking method of a multilayer perceptron (MLP) to create a new neural network and trained it on the MNIST dataset, demonstrating the effectiveness of our methodology.