
Low-resource domain adaptation while minimizing energy and hardware resource consumption

Source: arXiv
Abstract

Training Large Language Models (LLMs) is costly in terms of energy, hardware, and annotated data, often resulting in a positionality rooted in predominant cultures and values (Santy et al., 2023). Domain adaptation has emerged as a promising strategy to better align models with diverse cultural and value contexts (Hershcovich et al., 2022), but its computational cost remains a significant barrier, particularly for research groups lacking access to large-scale infrastructure. In this paper, we evaluate how the use of different numerical precision formats and data parallelization strategies impacts both training speed (as a proxy to energy and hardware consumption) and model accuracy, with the goal of facilitating domain adaptation in low-resource environments. Our findings are relevant to any setting where energy efficiency, accessibility, or limited hardware availability are key concerns.
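The abstract does not spell out the training setup, but as a rough illustration of what varying the numerical precision format means in practice, the sketch below uses PyTorch automatic mixed precision. The function name, the bfloat16 default, and the single-GPU classification loss are illustrative assumptions, not the authors' configuration.

```python
# A minimal sketch (not the authors' code) of switching the numerical
# precision used during fine-tuning, via PyTorch automatic mixed
# precision (AMP). Model, data, and the bf16 default are placeholders.
import torch
import torch.nn as nn

def make_train_step(dtype=torch.bfloat16):
    # Loss scaling is only required for float16, whose narrow exponent
    # range can underflow gradients; bfloat16 shares float32's range,
    # so the scaler is disabled (a pass-through) in that case.
    scaler = torch.cuda.amp.GradScaler(enabled=(dtype == torch.float16))

    def step(model, inputs, targets, optimizer):
        optimizer.zero_grad(set_to_none=True)
        # autocast runs eligible ops (matmuls, etc.) in `dtype` while
        # keeping numerically sensitive ops and master weights in fp32.
        with torch.autocast(device_type="cuda", dtype=dtype):
            loss = nn.functional.cross_entropy(model(inputs), targets)
        scaler.scale(loss).backward()  # no-op scaling when disabled
        scaler.step(optimizer)
        scaler.update()
        return loss.item()

    return step
```

Data parallelization, the paper's other axis, composes with this: wrapping the model in torch.nn.parallel.DistributedDataParallel lets each replica run the same low-precision step on its own shard of the batch.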

Hernán Maina, Nicolás Wolovick, Luciana Benotti

Subjects: Computing Technology; Computer Science and Technology

Hernán Maina, Nicolás Wolovick, Luciana Benotti. Low-resource domain adaptation while minimizing energy and hardware resource consumption [EB/OL]. (2025-06-10) [2025-06-24]. https://arxiv.org/abs/2506.08433.
