Multimodal Spiking Neural Network for Space Robotic Manipulation
This paper presents a multimodal control framework based on spiking neural networks (SNNs) for robotic arms aboard space stations. It is designed to cope with limited onboard computational resources while enabling autonomous manipulation and material transfer during space operations. By combining geometric states with tactile and semantic information, the framework strengthens environmental awareness and supports more robust control strategies. To guide learning progressively, a dual-channel, three-stage curriculum reinforcement learning (CRL) scheme is further integrated into the system. The framework was validated on wall-mounted robotic arms across a range of tasks, including target approach, object grasping, and stable lifting. Experimental evaluations show that the proposed method consistently outperforms baseline approaches in both task success rate and energy efficiency, highlighting its suitability for real-world aerospace applications.
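The abstract does not specify the neuron model used in the SNN controller, but SNN control frameworks are commonly built from leaky integrate-and-fire (LIF) units. The sketch below is a minimal, illustrative LIF step under assumed dynamics and hypothetical parameters (`tau`, `v_th`, `v_reset`), not the paper's implementation.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch -- an illustrative
# assumption about the spiking dynamics an SNN controller might use;
# the parameters below are hypothetical, not taken from the paper.
def lif_step(v, i_in, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of the membrane potential v driven by input current i_in."""
    v = v + (dt / tau) * (-v + i_in)  # leaky integration toward the input
    spike = v >= v_th                 # fire when the threshold is crossed
    if spike:
        v = v_reset                   # hard reset after a spike
    return v, spike

# Drive the neuron with a constant supra-threshold input and count spikes.
v, spikes = 0.0, 0
for _ in range(100):
    v, s = lif_step(v, i_in=1.5)
    spikes += int(s)
```

Because the event-driven spikes are sparse, inference on neuromorphic hardware can be substantially cheaper than dense activations, which is the energy-efficiency argument the abstract makes for resource-constrained space platforms.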
Liwen Zhang, Dong Zhou, Shibo Shao, Zihao Su, Guanghui Sun
Subjects: Aerospace Technology; Automation Technology and Equipment
Liwen Zhang, Dong Zhou, Shibo Shao, Zihao Su, Guanghui Sun. Multimodal Spiking Neural Network for Space Robotic Manipulation [EB/OL]. (2025-08-10) [2025-08-24]. https://arxiv.org/abs/2508.07287.