PBSpikformer: A Pure Spike-Driven Spiking Neural Network with Fourier-Phase Attention and Dynamic Batch Context

Authors: Chengfan Yang, Tao Deng, Ran Li, and Fei Yan
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 1432-1443
Keywords: Spiking neural network, Fourier transform, Attention mechanism

Abstract

Spiking Neural Networks (SNNs), inspired by biological neurons, have gained increasing attention for their energy efficiency and event-driven computation. However, their binary nature and complex dynamics make it difficult to train high-performance, low-latency models, limiting their progress compared to Artificial Neural Networks (ANNs). To address these challenges, we propose PBSpikformer, a directly trainable spiking Transformer architecture that incorporates two novel components: Fourier-Phase Attention (FPA) and Dynamic Batch Context (DBC). FPA combines spike-based Q-K token attention with Spectral Cross-Modal Augmentation (SCMA) to effectively fuse spatial, temporal, and frequency-domain features while reducing computational complexity. DBC introduces batch-level global signals to modulate local and global activations, improving gradient flow and training robustness. Extensive experiments show that PBSpikformer outperforms existing SNN models across multiple benchmarks, achieving 96.7% accuracy on CIFAR10-DVS—a 12.7% improvement over previous methods—and becomes the first directly trained SNN to surpass 90% accuracy on this dataset.
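The abstract does not give implementation details, but the two ideas FPA combines — attention over binary spike tensors (so Q-K scores reduce to co-activation counts, with no softmax) and a frequency-domain augmentation of token features — can be illustrated with a minimal NumPy sketch. All function names, shapes, and the phase-only augmentation below are hypothetical stand-ins, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_qk_attention(q, k, v):
    """Attention with binary (spike) Q and K: the score matrix is just
    integer co-activation counts, scaled by feature dimension."""
    scores = (q @ k.T) / max(k.shape[1], 1)   # (N, N), no softmax needed
    return scores @ v

def spectral_phase_augment(x):
    """Hypothetical stand-in for SCMA: mix each token's features with a
    phase-only reconstruction of its Fourier spectrum."""
    spec = np.fft.rfft(x, axis=-1)
    phase_only = np.exp(1j * np.angle(spec))          # discard magnitude
    phase_feat = np.fft.irfft(phase_only, n=x.shape[-1], axis=-1)
    return x + phase_feat

# Toy example: 8 tokens with 16 features each.
N, D = 8, 16
q = (rng.random((N, D)) > 0.5).astype(float)  # binary spike tensors
k = (rng.random((N, D)) > 0.5).astype(float)
v = rng.standard_normal((N, D))

out = spike_qk_attention(q, k, spectral_phase_augment(v))
print(out.shape)  # (8, 16)
```

Because Q and K are binary, the score computation needs only additions rather than multiplications, which is consistent with the abstract's claim of reduced computational complexity in spike-driven attention.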