A parametric activation function based on Wendland RBF
This paper introduces a novel parametric activation function based on Wendland radial basis functions (RBFs) for deep neural networks. Wendland RBFs, known for their compact support, smoothness, and positive definiteness in approximation theory, are adapted to address limitations of traditional activation functions like ReLU, sigmoid, and tanh. The proposed enhanced Wendland activation combines a standard Wendland component with linear and exponential terms, offering tunable locality, improved gradient propagation, and enhanced stability during training. Theoretical analysis highlights its mathematical properties, including smoothness and adaptability, while empirical experiments on synthetic tasks (e.g., sine wave approximation) and benchmark datasets (MNIST, Fashion-MNIST) demonstrate competitive performance. Results show that the Wendland-based activation achieves superior accuracy in certain scenarios, particularly in regression tasks, while maintaining computational efficiency. The study bridges classical RBF theory with modern deep learning, suggesting that Wendland activations can mitigate overfitting and improve generalization through localized, smooth transformations. Future directions include hybrid architectures and domain-specific adaptations.
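The abstract describes the proposed activation as a compactly supported Wendland RBF component combined with linear and exponential terms, with tunable locality. The following is a minimal PyTorch sketch of one such construction, assuming the standard Wendland C2 kernel φ(r) = (1 − r)₊⁴(4r + 1) and illustrative learnable mixing weights (alpha, beta, gamma) and support radius; the paper's exact formulation and parameterization may differ.

```python
import torch
import torch.nn as nn

class WendlandActivation(nn.Module):
    """Sketch of a parametric activation mixing a compactly supported
    Wendland C2 kernel with linear and exponential terms.
    Parameter names and coefficients are illustrative assumptions,
    not the paper's exact definition."""

    def __init__(self, support: float = 2.0):
        super().__init__()
        # Learnable mixing weights and support radius (assumed, for tunable locality).
        self.alpha = nn.Parameter(torch.tensor(1.0))        # weight of the Wendland term
        self.beta = nn.Parameter(torch.tensor(0.5))         # weight of the linear term
        self.gamma = nn.Parameter(torch.tensor(0.1))        # weight of the exponential term
        self.support = nn.Parameter(torch.tensor(support))  # radius of compact support

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Wendland C2 basis phi(r) = (1 - r)_+^4 (4r + 1), evaluated at r = |x| / support.
        r = torch.abs(x) / torch.clamp(self.support, min=1e-6)
        wendland = torch.clamp(1.0 - r, min=0.0) ** 4 * (4.0 * r + 1.0)
        # Linear and exponential terms keep gradients nonzero outside the compact support.
        return self.alpha * wendland + self.beta * x + self.gamma * (torch.exp(-r) - 1.0)

# Usage: drop-in replacement for a fixed activation in an MLP layer.
act = WendlandActivation()
y = act(torch.randn(8))
```

The compact support of the Wendland term gives the localized, smooth transformation the abstract credits with mitigating overfitting, while the added linear term provides a global gradient path similar to a residual connection.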
Majid Darehmiraki
Mathematical and Computational Techniques; Computer Technology
Majid Darehmiraki. A parametric activation function based on Wendland RBF [EB/OL]. (2025-06-28) [2025-07-25]. https://arxiv.org/abs/2507.11493.