In this article, we present a novel approach to learning flow functions of spiking neural networks (SNNs) using recurrent neural network (RNN) models. Our method is designed to address two significant challenges in this setting: computational complexity and non-stationarity.
To overcome these challenges, we introduce a windowed loss function that reduces the computational cost of training while still using all available training data. By dividing the input sequences into shorter segments and computing the loss on each segment separately, we significantly reduce the required simulation length without sacrificing accuracy.
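As a concrete illustration, the sketch below shows one way such a windowed loss could be computed. The model interface (`model(inputs, init_state=...)`), the tensor shapes, and the use of mean-squared error are our assumptions for the example, not the article's exact formulation.

```python
import torch

def windowed_loss(model, inputs, targets, window_len):
    """Sketch of a windowed trajectory loss.

    inputs:  (T, input_dim) input sequence
    targets: (T, output_dim) reference output trajectory
    The model is assumed to accept an initial state, so each window
    can be simulated independently for only `window_len` steps.
    """
    T = inputs.shape[0]
    total, count = 0.0, 0
    for start in range(0, T - window_len + 1, window_len):
        seg_in = inputs[start:start + window_len]
        seg_tgt = targets[start:start + window_len]
        # Restart the simulation from the measured state at the
        # window boundary instead of simulating the full trajectory.
        seg_pred = model(seg_in, init_state=seg_tgt[0])
        total = total + torch.mean((seg_pred - seg_tgt) ** 2)
        count += 1
    return total / count
```

Because every window contributes a loss term, no training data is discarded, yet no single forward simulation is longer than `window_len` steps.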
Moreover, we propose a rejection sampling technique to select the most informative samples for training. This technique discards redundant or uninformative data and emphasizes segments that exhibit spiking behavior. By combining the windowed loss and rejection sampling, our approach provides an effective solution to the challenges in SNN training.
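A minimal sketch of such a filter is given below, assuming that a window's informativeness is measured by its spike count and that quiescent windows are kept only with a small probability; both the criterion and the probability value are illustrative assumptions rather than the article's definitions.

```python
import numpy as np

def select_informative(windows, spike_counts, rng, keep_prob_quiet=0.1):
    """Rejection-sampling filter over training windows (sketch).

    windows:      list of training windows
    spike_counts: spike_counts[i] is the number of spikes in windows[i]
    Windows containing spikes are always kept; quiescent windows are
    kept only with probability `keep_prob_quiet` (an assumed criterion).
    """
    kept = []
    for window, n_spikes in zip(windows, spike_counts):
        if n_spikes > 0 or rng.random() < keep_prob_quiet:
            kept.append(window)
    return kept

# Usage sketch:
# rng = np.random.default_rng(0)
# train_windows = select_informative(windows, counts, rng)
```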
Our proposed method is grounded in the semigroup property of the flow function, which enables us to construct the new loss function from shorter segments of the output trajectories: concatenating the flows over consecutive segments recovers the flow over the full horizon, so training on short segments still makes use of all the training data.
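In standard notation (assumed here, since the article's symbols are not reproduced), the semigroup property states that evolving the system for t + s time units equals evolving it for t units and then for s more units from the intermediate state:

```latex
% Semigroup property of the flow function \varphi, with initial
% state x_0 and input signal u; u(\cdot + t) denotes the input
% shifted by t. Notation is assumed for illustration.
\varphi(t + s,\, x_0,\, u)
  \;=\;
\varphi\bigl(s,\; \varphi(t,\, x_0,\, u),\; u(\cdot + t)\bigr)
```

This identity is what justifies restarting the simulation at each window boundary in the windowed loss above.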
In summary, our article presents a novel approach to learning flow functions of spiking neural networks with RNN models, addressing the computational complexity and non-stationarity issues. By combining the windowed loss and rejection sampling techniques, we provide an effective training procedure. Our method has important implications for understanding the behavior of SNNs and could lead to more accurate and efficient training methods in the future.