SpaRC-AD: A Baseline for Radar-Camera Fusion in End-to-End Autonomous Driving
End-to-end autonomous driving systems promise stronger performance through unified optimization of perception, motion forecasting, and planning. However, vision-only approaches face fundamental limitations under adverse weather, partial occlusion, and in precise velocity estimation: critical challenges in safety-sensitive scenarios where accurate motion understanding and long-horizon trajectory prediction are essential for collision avoidance. To address these limitations, we propose SpaRC-AD, a query-based end-to-end camera-radar fusion framework for planning-oriented autonomous driving. Through sparse 3D feature alignment and Doppler-based velocity estimation, we achieve strong 3D scene representations for the refinement of agent anchors, map polylines, and motion modeling. Our method achieves consistent improvements over state-of-the-art vision-only baselines across multiple autonomous driving tasks, including 3D detection (+4.8% mAP), multi-object tracking (+8.3% AMOTA), online mapping (+1.8% mAP), motion prediction (-4.0% mADE), and trajectory planning (-0.1 m L2 and -9% TPC). We achieve both spatial coherence and temporal consistency on several challenging benchmarks, including the real-world open-loop nuScenes, the long-horizon T-nuScenes, and the closed-loop Bench2Drive simulator, demonstrating the effectiveness of radar-based fusion in such safety-critical scenarios. The source code for all experiments is available at https://phi-wol.github.io/sparcad/
Philipp Wolters, Johannes Gilg, Torben Teepe, Gerhard Rigoll
Radar
Philipp Wolters, Johannes Gilg, Torben Teepe, Gerhard Rigoll. SpaRC-AD: A Baseline for Radar-Camera Fusion in End-to-End Autonomous Driving [EB/OL]. (2025-08-14) [2025-08-24]. https://arxiv.org/abs/2508.10567.