GraspVLA: a Grasping Foundation Model Pre-trained on Billion-scale Synthetic Action Data
Embodied foundation models are gaining increasing attention for their zero-shot generalization, scalability, and adaptability to new tasks through few-shot post-training. However, existing models rely heavily on real-world data, which is costly and labor-intensive to collect. Synthetic data offers a cost-effective alternative, yet its potential remains largely underexplored. To bridge this gap, we explore the feasibility of training Vision-Language-Action models entirely with large-scale synthetic action data. We curate SynGrasp-1B, a billion-frame robotic grasping dataset generated in simulation with photorealistic rendering and extensive domain randomization. Building on this, we present GraspVLA, a VLA model pre-trained on large-scale synthetic action data as a foundation model for grasping tasks. GraspVLA integrates autoregressive perception tasks and flow-matching-based action generation into a unified Chain-of-Thought process, enabling joint training on synthetic action data and Internet semantics data. This design helps mitigate sim-to-real gaps and facilitates the transfer of learned actions to a broader range of Internet-covered objects, achieving open-vocabulary generalization in grasping. Extensive evaluations across real-world and simulation benchmarks demonstrate GraspVLA's advanced zero-shot generalizability and few-shot adaptability to specific human preferences. We will release the SynGrasp-1B dataset and pre-trained weights to benefit the community.
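To make the flow-matching action generation mentioned in the abstract concrete, below is a minimal sketch of a conditional flow-matching action head: a velocity network trained on a linear interpolation path between noise and ground-truth actions, then sampled by Euler integration. All names (ActionFlowHead, cond_dim, action_dim) and architectural details are illustrative assumptions, not taken from the GraspVLA paper or its released code.

```python
# Minimal, hypothetical sketch of flow-matching action generation conditioned
# on a vision-language feature vector (rectified-flow-style linear path).
import torch
import torch.nn as nn

class ActionFlowHead(nn.Module):
    def __init__(self, action_dim: int = 7, cond_dim: int = 512, hidden: int = 256):
        super().__init__()
        # Small MLP predicting the velocity field v_theta(a_t, t, cond).
        self.net = nn.Sequential(
            nn.Linear(action_dim + cond_dim + 1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, action_dim),
        )

    def forward(self, noisy_action, t, cond):
        return self.net(torch.cat([noisy_action, cond, t], dim=-1))

def flow_matching_loss(head, action, cond):
    # Linear path a_t = (1 - t) * noise + t * action; target velocity is action - noise.
    noise = torch.randn_like(action)
    t = torch.rand(action.shape[0], 1)
    a_t = (1 - t) * noise + t * action
    v_pred = head(a_t, t, cond)
    return ((v_pred - (action - noise)) ** 2).mean()

@torch.no_grad()
def sample_action(head, cond, action_dim: int = 7, steps: int = 10):
    # Euler integration of the learned velocity field from noise (t=0) to an action (t=1).
    a = torch.randn(cond.shape[0], action_dim)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((cond.shape[0], 1), i * dt)
        a = a + dt * head(a, t, cond)
    return a
```

In a VLA setting, `cond` would be the fused vision-language embedding produced by the backbone (here just a placeholder tensor), and the sampled vector would parameterize an end-effector grasp command.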
Yuxin Yang, Jiayi Chen, Zhiqi Zhang, Taoyu Yang, Xuheng Zhang, Heming Cui, Zhizheng Zhang, He Wang, Shengliang Deng, Mi Yan, Songlin Wei, Haixin Ma
Automation technology and equipment; computing technology and computer technology
Yuxin Yang, Jiayi Chen, Zhiqi Zhang, Taoyu Yang, Xuheng Zhang, Heming Cui, Zhizheng Zhang, He Wang, Shengliang Deng, Mi Yan, Songlin Wei, Haixin Ma. GraspVLA: a Grasping Foundation Model Pre-trained on Billion-scale Synthetic Action Data [EB/OL]. (2025-05-06) [2025-05-28]. https://arxiv.org/abs/2505.03233.