
The Case for Instance-Optimized LLMs in OLAP Databases

Source: arXiv
Abstract

Large Language Models (LLMs) can enhance analytics systems with powerful data summarization, cleaning, and semantic transformation capabilities. However, deploying LLMs at scale -- processing millions to billions of rows -- remains prohibitively expensive in computation and memory. We present IOLM-DB, a novel system that makes LLM-enhanced database queries practical through query-specific model optimization. Instead of using general-purpose LLMs, IOLM-DB generates lightweight, specialized models tailored to each query's specific needs using representative data samples. Through aggressive compression techniques, including quantization, sparsification, and structural pruning, IOLM-DB reduces model footprints by up to 76% and increases throughput by up to 3.31× while maintaining accuracy. We further show how our approach enables higher parallelism on existing hardware and seamlessly supports caching and batching strategies to reduce overheads. Our prototype demonstrates that leveraging LLM queries inside analytics systems is feasible at scale, opening new possibilities for future OLAP applications.
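To make the compression claims concrete, the following is a minimal, generic sketch of two of the named techniques, 8-bit quantization and magnitude-based sparsification, applied to a synthetic weight matrix. This is not IOLM-DB's implementation (the paper does not publish code here); the function names and the 50% sparsity setting are illustrative assumptions. It shows how int8 storage alone already cuts the footprint of a float32 layer by roughly 75%, in the ballpark of the reported 76% reduction.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization: float32 -> int8 plus one scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Unstructured sparsification: zero the smallest-magnitude fraction of weights."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)  # toy "layer"

pruned = prune_by_magnitude(w, sparsity=0.5)        # illustrative sparsity level
q, scale = quantize_int8(pruned)
dequant = q.astype(np.float32) * scale              # reconstruction for accuracy checks

fp32_bytes = w.nbytes          # 256 * 256 * 4 bytes
int8_bytes = q.nbytes + 4      # int8 weights + one float32 scale factor
print(f"footprint reduction: {1 - int8_bytes / fp32_bytes:.0%}")  # → 75%
```

In a real pipeline the dequantized (and sparsified) weights would be validated against representative query samples before the specialized model replaces the general-purpose one.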

Bardia Mohammadi, Laurent Bindschaedler

Subject: Computing Technology, Computer Technology

Bardia Mohammadi, Laurent Bindschaedler. The Case for Instance-Optimized LLMs in OLAP Databases [EB/OL]. (2025-07-07) [2025-07-21]. https://arxiv.org/abs/2507.04967.
