National Preprint Platform

Graph Laplacian Wavelet Transformer via Learnable Spectral Decomposition

Source: arXiv
Abstract

Existing sequence-to-sequence models for structured language tasks rely heavily on the dot-product self-attention mechanism, which incurs quadratic complexity in both computation and memory for input length N. We introduce the Graph Wavelet Transformer (GWT), a novel architecture that replaces this bottleneck with a learnable, multi-scale wavelet transform defined over an explicit graph Laplacian derived from syntactic or semantic parses. Our analysis shows that multi-scale spectral decomposition offers an interpretable, efficient, and expressive alternative to quadratic self-attention for graph-structured sequence modeling.
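To make the abstract's core idea concrete, the following is a minimal sketch of multi-scale spectral filtering over a graph Laplacian. The specific wavelet kernel (a heat-kernel band-pass, g_s(λ) = λ·e^{−sλ}) and the fixed scale values are illustrative assumptions; the paper's GWT makes these filters learnable, and the graph here stands in for one derived from a syntactic or semantic parse.

```python
import numpy as np

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    return np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def wavelet_filter_bank(adj, x, scales=(0.5, 1.0, 2.0)):
    """Filter node features x (N, d) with spectral wavelets at several
    scales, returning one filtered signal per scale: (len(scales), N, d)."""
    L = normalized_laplacian(adj)
    lam, U = np.linalg.eigh(L)       # spectral decomposition of L
    x_hat = U.T @ x                  # graph Fourier transform of the features
    outs = []
    for s in scales:
        g = lam * np.exp(-s * lam)   # illustrative heat-kernel wavelet at scale s
        outs.append(U @ (g[:, None] * x_hat))  # filter and invert the transform
    return np.stack(outs)
```

Each scale emphasizes a different frequency band of the graph signal, which is what makes the decomposition multi-scale; a learnable variant would parameterize g rather than fixing it.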

Andrew Kiruluta, Eric Lundy, Priscilla Burity

Computing Technology; Computer Technology

Andrew Kiruluta, Eric Lundy, Priscilla Burity. Graph Laplacian Wavelet Transformer via Learnable Spectral Decomposition [EB/OL]. (2025-05-08) [2025-06-12]. https://arxiv.org/abs/2505.07862.