National Preprint Platform (国家预印本平台)

Improvement of the Activation Function Pair in Dual Neural Networks

Abstract

Compared with traditional numerical integration methods, the dual-neural-network method for computing multiple definite integrals has the advantage of recovering an antiderivative (primitive function) of the integrand; in actual training, however, its precision and efficiency still fall short of expectations. To address this, this paper constructs a new activation function pair for the dual neural network, introducing sigmoid()/softplus() as the pair. Simulation examples show that, compared with a dual neural network using a traditional activation function pair, the network with the sigmoid()/softplus() pair achieves higher precision and faster convergence.
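The identity that makes sigmoid/softplus a natural pair is d/dx softplus(wx + b) = w·sigmoid(wx + b): a single-hidden-layer network with softplus units is an exact antiderivative of the same-weight network with sigmoid units, once the output weights are rescaled. The following is a minimal NumPy sketch of that idea, not the paper's implementation; the network weights here are random placeholders standing in for a trained network:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softplus(z):
    # Numerically stable log(1 + exp(z)).
    return np.logaddexp(0.0, z)

# Toy single-hidden-layer "derivative" network with sigmoid units.
# Random placeholder weights stand in for a trained fit to the integrand.
rng = np.random.default_rng(0)
w = rng.normal(size=8)   # hidden-layer weights
b = rng.normal(size=8)   # hidden-layer biases
v = rng.normal(size=8)   # output weights

def g(x):
    # g(x) = sum_i v_i * sigmoid(w_i * x + b_i): plays the role of the integrand fit.
    z = np.outer(np.atleast_1d(x), w) + b
    return np.sum(v * sigmoid(z), axis=-1)

def G(x):
    # Dual (primitive) network: swap sigmoid for softplus and rescale each
    # output weight by 1/w_i. Since d/dx softplus(w*x+b) = w*sigmoid(w*x+b),
    # G is an exact antiderivative of g -- no extra training is needed.
    z = np.outer(np.atleast_1d(x), w) + b
    return np.sum((v / w) * softplus(z), axis=-1)

# Sanity check: G(hi) - G(lo) should match numerical quadrature of g on [lo, hi].
lo, hi = -1.0, 2.0
xs = np.linspace(lo, hi, 20001)
gx = g(xs)
quad = float(np.sum((gx[1:] + gx[:-1]) * np.diff(xs) / 2.0))  # trapezoid rule
exact = float(G(hi)[0] - G(lo)[0])
```

Once the sigmoid network is trained to fit the integrand, any definite integral reduces to two evaluations of the softplus network, which is the efficiency advantage the abstract refers to.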

李海滨 (Li Haibin), 李尚杰 (Li Shangjie)

Computing Technology; Computer Technology

Keywords: dual neural network; multiple definite integral; sigmoid()/softplus() activation function pair

李海滨, 李尚杰. 对偶神经网络中激活函数对的改进 [EB/OL]. (2018-03-29) [2025-08-16]. http://www.paper.edu.cn/releasepaper/content/201803-265.
