National Preprint Platform

Orthogonal Low Rank Embedding Stabilization

Source: arXiv
Abstract

The instability of embedding spaces across model retraining cycles presents significant challenges to downstream applications using user or item embeddings derived from recommendation systems as input features. This paper introduces a novel orthogonal low-rank transformation methodology designed to stabilize the user/item embedding space, ensuring consistent embedding dimensions across retraining sessions. Our approach leverages a combination of efficient low-rank singular value decomposition and orthogonal Procrustes transformation to map embeddings into a standardized space. This transformation is computationally efficient, lossless, and lightweight, preserving the dot product and inference quality while reducing operational burdens. Unlike existing methods that modify training objectives or embedding structures, our approach maintains the integrity of the primary model application and can be seamlessly integrated with other stabilization techniques.
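The paper itself does not include code, but the core idea it describes (an orthogonal Procrustes rotation that maps freshly retrained embeddings back into a reference space while preserving dot products) can be illustrated with a minimal NumPy sketch. All names here (`stabilize`, `ref_emb`, `new_emb`) are hypothetical, and this omits the paper's low-rank SVD truncation step, solving only the full-rank alignment:

```python
# Hypothetical sketch of orthogonal Procrustes embedding alignment.
# Assumes the reference and retrained embedding matrices have their
# rows (users/items) in the same order.
import numpy as np

def stabilize(new_emb, ref_emb):
    """Rotate new_emb onto ref_emb with the optimal orthogonal map."""
    # The SVD of ref^T @ new yields the closed-form Procrustes solution.
    u, _, vt = np.linalg.svd(ref_emb.T @ new_emb)
    r = u @ vt                    # orthogonal rotation, r @ r.T == I
    return new_emb @ r.T          # embeddings mapped into the reference space

rng = np.random.default_rng(0)
ref = rng.normal(size=(100, 16))                 # reference embeddings
q, _ = np.linalg.qr(rng.normal(size=(16, 16)))   # a random orthogonal "drift"
new = ref @ q                                    # simulated retrained embeddings
aligned = stabilize(new, ref)

print(np.allclose(aligned, ref))                 # → True: drift fully recovered
```

Because the map is orthogonal, `aligned @ aligned.T` equals `new @ new.T` exactly, which is the lossless, dot-product-preserving property the abstract claims; downstream consumers see stable feature dimensions without any change to the recommender's training objective.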

Kevin Zielnicki, Ko-Jen Hsiao

DOI: 10.1145/3705328.3748141

Subject: Computing Technology, Computer Technology

Kevin Zielnicki, Ko-Jen Hsiao. Orthogonal Low Rank Embedding Stabilization [EB/OL]. (2025-08-11) [2025-08-24]. https://arxiv.org/abs/2508.07574.
