
Towards a General-Purpose Zero-Shot Synthetic Low-Light Image and Video Pipeline

Source: arXiv
Abstract

Low-light conditions pose significant challenges for both human and machine annotation. This in turn has led to a lack of research into machine understanding of low-light images and (in particular) videos. A common approach is to apply annotations obtained from high-quality datasets to synthetically created low-light versions. However, these approaches are often limited by the use of unrealistic noise models. In this paper, we propose a new Degradation Estimation Network (DEN), which synthetically generates realistic standard RGB (sRGB) noise without requiring camera metadata. This is achieved by estimating the parameters of physics-informed noise distributions, trained in a self-supervised manner. This zero-shot approach allows our method to generate synthetic noisy content with a diverse range of realistic noise characteristics, unlike other methods, which focus on recreating the noise characteristics of their training data. We evaluate our proposed synthetic pipeline using various methods trained on its synthetic data for typical low-light tasks, including synthetic noise replication, video enhancement, and object detection, showing improvements of up to 24% KLD, 21% LPIPS, and 62% AP$_{50-95}$, respectively.
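The abstract does not specify the DEN architecture or the exact form of the physics-informed noise distributions, so the following is only a minimal illustrative sketch: it applies a heteroscedastic Gaussian approximation of a Poisson-Gaussian (shot plus read) noise model to a clean sRGB image. The function name `add_physics_informed_noise` and the parameters `shot_scale` and `read_sigma` are hypothetical stand-ins for the quantities that, in the paper's pipeline, would instead be estimated by the DEN.

```python
import numpy as np

def add_physics_informed_noise(clean_srgb, shot_scale=0.01, read_sigma=0.002, seed=None):
    """Degrade a clean sRGB image (values in [0, 1]) with a simple
    Poisson-Gaussian noise model, approximated as heteroscedastic Gaussian.

    Note: shot_scale and read_sigma are hand-picked placeholders here;
    the paper's DEN would predict such degradation parameters instead.
    """
    rng = np.random.default_rng(seed)
    img = np.clip(clean_srgb, 0.0, 1.0).astype(np.float64)

    # Noise variance grows linearly with signal intensity (shot noise)
    # plus a signal-independent floor (read noise).
    variance = shot_scale * img + read_sigma ** 2
    noisy = img + rng.normal(0.0, 1.0, size=img.shape) * np.sqrt(variance)
    return np.clip(noisy, 0.0, 1.0)

# Example: synthesize a noisy version of a mid-grey test image.
clean = np.full((64, 64, 3), 0.5)
noisy = add_physics_informed_noise(clean, shot_scale=0.02, read_sigma=0.005, seed=0)
```

Because the noise variance depends on the clean signal, varying the (here hand-chosen) parameters per image is one simple way to produce a range of noise characteristics rather than replicating a single sensor's profile.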

Joanne Lin, Crispian Morris, Ruirui Lin, Fan Zhang, David Bull, Nantheera Anantrasirichai

Subjects: Computing Technology, Computer Technology

Joanne Lin, Crispian Morris, Ruirui Lin, Fan Zhang, David Bull, Nantheera Anantrasirichai. Towards a General-Purpose Zero-Shot Synthetic Low-Light Image and Video Pipeline [EB/OL]. (2025-04-16) [2025-06-27]. https://arxiv.org/abs/2504.12169.
