National Preprint Platform (国家预印本平台)

Sinusoidal Initialization, Time for a New Start


Source: arXiv
Abstract

Initialization plays a critical role in Deep Neural Network training, directly influencing convergence, stability, and generalization. Common approaches such as Glorot and He initializations rely on randomness, which can produce uneven weight distributions across layer connections. In this paper, we introduce the Sinusoidal initialization, a novel deterministic method that employs sinusoidal functions to construct structured weight matrices expressly to improve the spread and balance of weights throughout the network while simultaneously fostering a more uniform, well-conditioned distribution of neuron activation states from the very first forward pass. Because Sinusoidal initialization begins with weights and activations that are already evenly and efficiently utilized, it delivers consistently faster convergence, greater training stability, and higher final accuracy across a wide range of models, including convolutional neural networks, vision transformers, and large language models. On average, our experiments show an increase of 4.9% in final validation accuracy and 20.9% in convergence speed. By replacing randomness with structure, this initialization provides a stronger and more reliable foundation for Deep Learning systems.
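The abstract describes building weight matrices deterministically from sinusoids so that weights are evenly spread from the first forward pass; the precise construction is given only in the paper itself. As a hedged illustration, here is a minimal NumPy sketch of one *plausible* deterministic sinusoidal initializer. The row-dependent frequency, the phase schedule, and the Glorot-style variance scaling are all assumptions for demonstration, not the authors' formula:

```python
import numpy as np

def sinusoidal_init(fan_out: int, fan_in: int) -> np.ndarray:
    """Hypothetical deterministic sinusoidal initializer (illustration only).

    Each output row is a sinusoid over the input indices with a
    row-dependent frequency and phase, so the matrix is identical on
    every call, unlike the random draws of Glorot or He initialization.
    """
    j = np.arange(fan_in)
    rows = []
    for i in range(fan_out):
        freq = 2.0 * np.pi * (i + 1) / fan_in   # frequency grows with row index
        phase = np.pi * i / max(fan_out, 1)     # phase shift decorrelates rows
        rows.append(np.sin(freq * j + phase))
    W = np.stack(rows)
    # Glorot-style variance scaling: sin has variance ~0.5 over a full
    # period, so rescale toward Var(W) = 2 / (fan_in + fan_out).
    return W * np.sqrt(2.0 / (fan_in + fan_out)) / np.sqrt(0.5)
```

Unlike a random initializer, two calls with the same shape return the exact same matrix, which is the "replacing randomness with structure" property the abstract emphasizes.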

Alberto Fernández-Hernández, Jose I. Mestre, Manuel F. Dolz, Jose Duato, Enrique S. Quintana-Ortí

Subjects: Computing Technology, Computer Technology

Alberto Fernández-Hernández, Jose I. Mestre, Manuel F. Dolz, Jose Duato, Enrique S. Quintana-Ortí. Sinusoidal Initialization, Time for a New Start [EB/OL]. (2025-05-19) [2025-06-28]. https://arxiv.org/abs/2505.12909.