Neural Network-Based Phase Compensation Method for Laser Array Systems
Traditional adaptive optics (AO) techniques compensate wavefront distortion through many iterations of an optimization algorithm, and their slow convergence limits their use in real-time systems. To address this problem, this study proposes a neural-network-based phase correction method for laser arrays, using a convolutional neural network (CNN) for rapid phase prediction and correction. A CNN model is first trained for a 3×3 laser array to process far-field spot images and to predict and correct the phase; experimental results show that the model raises the mean power in the bucket (PIB) to 0.881. For the more complex 6×6 laser array, dividing it into multiple 3×3 sub-arrays and applying the CNN to each achieves good compensation with a single prediction. Compared with the traditional stochastic parallel gradient descent (SPGD) algorithm, the CNN greatly reduces the number of iterations and increases processing speed. Further experiments show that combining the CNN model with the SPGD algorithm improves beam quality even more, converging four times faster than SPGD alone. This work therefore provides an effective deep-learning-based solution for real-time, high-quality phase compensation of laser arrays, and confirms that CNN-based phase compensation offers a significant advantage in system real-time performance.
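The abstract contrasts single-shot CNN prediction with iterative SPGD optimization of a power-in-the-bucket (PIB) metric. As a rough illustration of the iterative baseline only, the sketch below implements a generic two-sided SPGD loop that maximizes the on-axis combining efficiency of an array of piston phases, used here as a simple stand-in for PIB. The 3×3 array size matches the abstract, but the gain, perturbation amplitude, and iteration count are illustrative assumptions, not the paper's parameters.

```python
import cmath
import random

def combining_efficiency(phases):
    """Normalized on-axis intensity of N coherently combined beams,
    |sum_i exp(j*phi_i)|^2 / N^2; equals 1 when all phases align.
    Used here as a simple stand-in for the power-in-the-bucket metric."""
    n = len(phases)
    field = sum(cmath.exp(1j * p) for p in phases)
    return abs(field) ** 2 / n ** 2

def spgd_step(phases, gain=30.0, delta=0.1, rng=random):
    """One two-sided SPGD iteration: apply random +/-delta perturbations,
    measure the metric at +delta and -delta, and move each phase in
    proportion to the metric difference times its own perturbation."""
    perturb = [delta if rng.random() < 0.5 else -delta for _ in phases]
    j_plus = combining_efficiency([p + d for p, d in zip(phases, perturb)])
    j_minus = combining_efficiency([p - d for p, d in zip(phases, perturb)])
    dj = j_plus - j_minus
    return [p + gain * dj * d for p, d in zip(phases, perturb)]

if __name__ == "__main__":
    rng = random.Random(0)
    # 9 emitters, i.e. a 3x3 array, with random initial piston phases.
    phases = [rng.uniform(-3.14, 3.14) for _ in range(9)]
    print("start:", round(combining_efficiency(phases), 3))
    for _ in range(3000):
        phases = spgd_step(phases, rng=rng)
    print("after SPGD:", round(combining_efficiency(phases), 3))
```

A CNN-based corrector as described in the abstract would replace this loop with a single forward pass from the measured spot image to the predicted phases; the hybrid scheme would use that prediction to initialize `phases` before running a few SPGD refinement steps.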
Hong Xiaobin, He Ying
Optoelectronic Technology
Adaptive optics; Coherent beam combining; Convolutional neural networks; Wavefront correction
Hong Xiaobin, He Ying. Neural Network-Based Phase Compensation Method [EB/OL]. (2024-03-13) [2025-08-18]. http://www.paper.edu.cn/releasepaper/content/202403-114.