Using Context-to-Vector with Graph Retrofitting to Improve Word Embeddings

Source: arXiv
Abstract

Although contextualized embeddings generated from large-scale pre-trained models perform well in many tasks, traditional static embeddings (e.g., Skip-gram, Word2Vec) still play an important role in low-resource and lightweight settings due to their low computational cost, ease of deployment, and stability. In this paper, we aim to improve word embeddings by 1) incorporating more contextual information from existing pre-trained models into the Skip-gram framework, which we call Context-to-Vec; and 2) proposing a post-processing retrofitting method for static embeddings, independent of training, that employs a priori synonym knowledge and weighted vector distributions. On both extrinsic and intrinsic tasks, our methods are shown to outperform the baselines by a large margin.
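The abstract only names the two components, so the following is a minimal numpy sketch of one plausible reading of Context-to-Vec: a standard skip-gram-with-negative-sampling update plus a distillation-style pull of the static vector toward a contextual vector produced offline by a pre-trained model. The sgns_step function, the lam weight, and the precomputed ctx_vec are illustrative assumptions, not the paper's actual objective.

    import numpy as np

    rng = np.random.default_rng(0)
    V, D = 1000, 50                             # toy vocab size and embedding dim
    W_in = rng.normal(scale=0.1, size=(V, D))   # static (input) word vectors
    W_out = rng.normal(scale=0.1, size=(V, D))  # context (output) vectors

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sgns_step(center, context, neg_ids, ctx_vec=None, lr=0.05, lam=0.1):
        # One skip-gram negative-sampling update; optionally pull the static
        # vector toward a contextual vector from a pre-trained model
        # (hypothetical stand-in for the Context-to-Vec signal).
        v, grad_v = W_in[center], np.zeros(D)
        for u_id, label in [(context, 1.0)] + [(n, 0.0) for n in neg_ids]:
            u = W_out[u_id]
            g = sigmoid(v @ u) - label       # gradient of the logistic loss
            grad_v += g * u
            W_out[u_id] -= lr * g * v
        if ctx_vec is not None:
            grad_v += lam * (v - ctx_vec)    # distillation pull toward context
        W_in[center] -= lr * grad_v

    # Toy call: word 3 seen near word 7, two negatives, fake context vector.
    sgns_step(3, 7, neg_ids=[11, 42], ctx_vec=rng.normal(size=D))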

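The retrofitting component can be read as iterative smoothing of the static vectors over a synonym graph. Below is a minimal sketch in the spirit of Faruqui et al.'s (2015) retrofitting update, the family of post-processing this paper builds on; the uniform alpha/beta weights and the toy lexicon are assumptions standing in for the paper's weighted vector distribution.

    import numpy as np

    def retrofit(embeddings, synonym_graph, alpha=1.0, beta=1.0, n_iters=10):
        # Iteratively move each vector toward the average of its synonyms'
        # vectors while staying anchored to its original embedding.
        new_vecs = {w: v.copy() for w, v in embeddings.items()}
        for _ in range(n_iters):
            for word, neighbors in synonym_graph.items():
                nbrs = [n for n in neighbors if n in new_vecs]
                if word not in new_vecs or not nbrs:
                    continue
                pulled = beta * sum(new_vecs[n] for n in nbrs)
                anchor = alpha * embeddings[word]
                new_vecs[word] = (anchor + pulled) / (alpha + beta * len(nbrs))
        return new_vecs

    # Toy usage: hypothetical 4-d vectors and a tiny synonym lexicon.
    rng = np.random.default_rng(0)
    emb = {w: rng.normal(size=4) for w in ["happy", "glad", "joyful", "sad"]}
    syn = {"happy": ["glad", "joyful"], "glad": ["happy"], "joyful": ["happy"]}
    retro = retrofit(emb, syn)
    print(retro["happy"] @ retro["glad"])  # synonyms end up closer together

Because this is a pure post-processing pass over already-trained vectors, it matches the abstract's claim that the retrofitting method is independent of training.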
Stan Z. Li, Jun Xia, Yue Zhang, Yile Wang, Ge Wang, Guojiang Zhao, Jiangbin Zheng, Yufei Huang

Computing and Computer Technology

Stan Z. Li, Jun Xia, Yue Zhang, Yile Wang, Ge Wang, Guojiang Zhao, Jiangbin Zheng, Yufei Huang. Using Context-to-Vector with Graph Retrofitting to Improve Word Embeddings [EB/OL]. (2022-10-30) [2025-08-02]. https://arxiv.org/abs/2210.16848.
