
Scalable Forward-Forward Algorithm

Source: arXiv
Abstract

We propose a scalable Forward-Forward (FF) algorithm that eliminates the need for backpropagation by training each layer separately. Unlike backpropagation, FF avoids backward gradients and can be more modular and memory-efficient, making it appealing for large networks. We extend FF to modern convolutional architectures, such as MobileNetV3 and ResNet18, by introducing a new way to compute losses for convolutional layers. Experiments show that our method achieves performance comparable to standard backpropagation. Furthermore, when we divide the network into blocks, such as the residual blocks in ResNet, and apply backpropagation only within each block but not across blocks, our hybrid design tends to outperform backpropagation baselines while maintaining a similar training speed. Finally, we present experiments on small datasets and transfer learning that confirm the adaptability of our method.
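As a rough illustration of the layer-wise training idea described above, the sketch below implements one Forward-Forward dense layer in plain NumPy, following Hinton's original goodness formulation (mean of squared activations pushed above a threshold for positive samples and below it for negative ones). This is a hypothetical minimal example, not the paper's method: the convolutional loss, the architectures, and all hyperparameters (`threshold`, `lr`, layer sizes) here are illustrative assumptions. Each layer has a purely local loss, so no gradients flow between layers.

```python
import numpy as np

# Minimal single-layer Forward-Forward sketch (illustrative assumptions only;
# the paper's convolutional loss and hybrid block design are not shown).
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class FFLayer:
    def __init__(self, n_in, n_out, threshold=2.0, lr=0.03):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)
        self.threshold = threshold
        self.lr = lr

    def forward(self, x):
        return relu(x @ self.W + self.b)

    def train_step(self, x_pos, x_neg):
        # Local logistic loss on goodness; gradients never cross layers.
        grad_W = np.zeros_like(self.W)
        grad_b = np.zeros_like(self.b)
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            h = self.forward(x)               # (B, n_out)
            goodness = (h ** 2).mean(axis=1)  # (B,)
            # loss = -log sigmoid(sign * (goodness - threshold))
            z = sign * (goodness - self.threshold)
            dz = -1.0 / (1.0 + np.exp(z))     # dL/dz
            dh = (dz * sign)[:, None] * 2.0 * h / h.shape[1]
            dh *= (h > 0)                     # ReLU gate
            grad_W += x.T @ dh / x.shape[0]
            grad_b += dh.mean(axis=0)
        self.W -= self.lr * grad_W
        self.b -= self.lr * grad_b
        # Normalized, detached output: what a next layer would train on.
        h = self.forward(np.vstack([x_pos, x_neg]))
        return h / (np.linalg.norm(h, axis=1, keepdims=True) + 1e-8)

# Toy demo: "positive" = structured inputs, "negative" = centered noise.
x_pos = rng.normal(1.0, 0.5, (64, 16))
x_neg = rng.normal(0.0, 0.5, (64, 16))
layer = FFLayer(16, 32)
for _ in range(200):
    layer.train_step(x_pos, x_neg)
g_pos = (layer.forward(x_pos) ** 2).mean()
g_neg = (layer.forward(x_neg) ** 2).mean()
print(g_pos > g_neg)  # positive goodness should exceed negative
```

Because each layer only needs its own activations and local gradient, activations of earlier layers can be discarded immediately, which is the source of the modularity and memory savings the abstract refers to.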

Andrii Krutsylo

Subject: Computing and Computer Technology

Andrii Krutsylo. Scalable Forward-Forward Algorithm [EB/OL]. (2025-01-06) [2025-08-03]. https://arxiv.org/abs/2501.03176.
