National Preprint Platform

PGKET: A Photonic Gaussian Kernel Enhanced Transformer
Source: arXiv

Abstract

Self-Attention Mechanisms (SAMs) enhance model performance by extracting key information but are inefficient when dealing with long sequences. To this end, a photonic Gaussian Kernel Enhanced Transformer (PGKET) is proposed, based on the Photonic Gaussian Kernel Self-Attention Mechanism (PGKSAM). The PGKSAM calculates the Photonic Gaussian Kernel Self-Attention Score (PGKSAS) using photon interferometry and superposition to process multiple inputs in parallel. Experimental results show that PGKET outperforms some state-of-the-art transformers in multi-classification tasks on MedMNIST v2 and CIFAR-10, and is expected to improve performance in complex tasks and accelerate the convergence of Photonic Computing (PC) and machine learning.
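The paper does not give implementation details here, but the classical counterpart of a Gaussian kernel self-attention score (which the photonic mechanism is presumably designed to compute or approximate) can be sketched as follows. The function name, the choice of a Euclidean Gaussian kernel exp(-||q_i - k_j||^2 / (2*sigma^2)), and the row-normalization step are assumptions for illustration, not the authors' PGKSAS definition:

```python
import numpy as np

def gaussian_kernel_attention(Q, K, V, sigma=1.0):
    """Classical Gaussian-kernel self-attention (illustrative sketch):
    score_ij = exp(-||q_i - k_j||^2 / (2*sigma^2)), row-normalized,
    then used to weight the values V."""
    # Pairwise squared Euclidean distances between queries and keys
    sq_dists = np.sum((Q[:, None, :] - K[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel scores replace the usual softmax(QK^T) scores
    scores = np.exp(-sq_dists / (2.0 * sigma ** 2))
    # Normalize each query's scores so attention weights sum to 1
    weights = scores / scores.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = gaussian_kernel_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

In this classical form the pairwise distance computation is the O(n^2) bottleneck for long sequences; the abstract's claim is that photon interferometry and superposition evaluate many such kernel scores in parallel.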

Ren-Xin Zhao

Subject: Computing Technology, Computer Technology

Ren-Xin Zhao. PGKET: A Photonic Gaussian Kernel Enhanced Transformer [EB/OL]. (2025-07-25) [2025-08-10]. https://arxiv.org/abs/2507.19041.
