What Time Tells Us? An Explorative Study of Time Awareness Learned from Static Images
Time becomes visible through changes in illumination in what we see. Inspired by this, in this paper we explore the potential to learn time awareness from static images, aiming to answer the question: what does time tell us? To this end, we first introduce a Time-Oriented Collection (TOC) dataset, which contains 130,906 images with reliable timestamps. Leveraging this dataset, we propose a Time-Image Contrastive Learning (TICL) approach to jointly model timestamps and the related visual representations through cross-modal contrastive learning. We find that the proposed TICL 1) achieves state-of-the-art performance on the timestamp estimation task across various benchmark metrics, and 2) interestingly, despite being trained only on static images, learns time-aware embeddings that show strong capability in several time-aware downstream tasks such as time-based image retrieval, video scene classification, and time-aware image editing. Our findings suggest that time-related visual cues can be learned from static images and are beneficial for various vision tasks, laying a foundation for future research on understanding time-related visual context. Project page: https://rathgrith.github.io/timetells/.
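The abstract describes TICL as a cross-modal contrastive objective between timestamps and images but gives no implementation details. The following is a minimal sketch, not the authors' code, of how such an objective could look: a CLIP-style symmetric InfoNCE loss paired with a hypothetical timestamp encoder that maps a cyclic hour-of-day encoding into the shared embedding space. The `TimeEncoder` design, the cyclic encoding, and the temperature value are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of a time-image
# contrastive objective in the spirit of TICL.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeEncoder(nn.Module):
    """Hypothetical timestamp encoder: hour of day -> (sin, cos) -> embedding."""

    def __init__(self, dim: int = 512):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2, 256), nn.ReLU(), nn.Linear(256, dim))

    def forward(self, hours: torch.Tensor) -> torch.Tensor:
        # hours: (B,) floats in [0, 24); encode cyclically so 23:59 is near 00:00.
        angle = hours / 24.0 * 2 * torch.pi
        cyclic = torch.stack([torch.sin(angle), torch.cos(angle)], dim=-1)
        return self.mlp(cyclic)


def time_image_contrastive_loss(image_emb, time_emb, temperature=0.07):
    """Symmetric InfoNCE over matched (image, timestamp) pairs in a batch."""
    image_emb = F.normalize(image_emb, dim=-1)
    time_emb = F.normalize(time_emb, dim=-1)
    logits = image_emb @ time_emb.t() / temperature        # (B, B) similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_i2t = F.cross_entropy(logits, targets)            # image -> time
    loss_t2i = F.cross_entropy(logits.t(), targets)        # time -> image
    return (loss_i2t + loss_t2i) / 2


if __name__ == "__main__":
    B, D = 8, 512
    image_emb = torch.randn(B, D)                   # stand-in for image-encoder output
    time_emb = TimeEncoder(D)(torch.rand(B) * 24)   # random hours of day
    print(time_image_contrastive_loss(image_emb, time_emb).item())
```

In this sketch the image encoder is left abstract (any backbone producing a `(B, D)` embedding would do); only the pairing of each image with its capture time drives the supervision.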
Dongheng Lin, Han Hu, Jianbo Jiao
Computing Technology, Computer Technology
Dongheng Lin, Han Hu, Jianbo Jiao. What Time Tells Us? An Explorative Study of Time Awareness Learned from Static Images [EB/OL]. (2025-03-22) [2025-04-26]. https://arxiv.org/abs/2503.17899.