
All Eyes, no IMU: Learning Flight Attitude from Vision Alone

Source: arXiv
Abstract

Vision is an essential part of attitude control for many flying animals, some of which have no dedicated sense of gravity. Flying robots, on the other hand, typically depend heavily on accelerometers and gyroscopes for attitude stabilization. In this work, we present the first vision-only approach to flight control for use in generic environments. We show that a quadrotor drone equipped with a downward-facing event camera can estimate its attitude and rotation rate from just the event stream, enabling flight control without inertial sensors. Our approach uses a small recurrent convolutional neural network trained through supervised learning. Real-world flight tests demonstrate that our combination of event camera and low-latency neural network is capable of replacing the inertial measurement unit in a traditional flight control loop. Furthermore, we investigate the network's generalization across different environments, and the impact of memory and different fields of view. While networks with memory and access to horizon-like visual cues achieve the best performance, variants with a narrower field of view achieve better relative generalization. Our work showcases vision-only flight control as a promising candidate for enabling autonomous, insect-scale flying robots.
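The abstract only states that a small recurrent convolutional network maps the event stream to attitude and rotation rate; it does not give the architecture. The sketch below is an illustrative stand-in in PyTorch, with hypothetical layer sizes, a GRU as the recurrent core, and made-up names (EventAttitudeNet, attitude_head, rate_head) that are not from the paper.

```python
import torch
import torch.nn as nn


class EventAttitudeNet(nn.Module):
    """Small recurrent ConvNet: event frames -> attitude + rotation rate.

    Hypothetical sketch; the paper only specifies a small recurrent
    convolutional network trained with supervised learning.
    """

    def __init__(self, in_channels: int = 2, hidden: int = 64):
        super().__init__()
        # Convolutional encoder over per-timestep event-count frames
        # (e.g. separate channels for positive and negative polarity).
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (B*T, 32, 1, 1)
            nn.Flatten(),             # -> (B*T, 32)
        )
        # Recurrent core provides the memory discussed in the abstract.
        self.gru = nn.GRU(32, hidden, batch_first=True)
        # Output heads: attitude (roll, pitch) and body rotation rates (p, q, r).
        self.attitude_head = nn.Linear(hidden, 2)
        self.rate_head = nn.Linear(hidden, 3)

    def forward(self, events, h=None):
        # events: (B, T, C, H, W) sequence of accumulated event frames.
        b, t = events.shape[:2]
        feats = self.encoder(events.flatten(0, 1)).view(b, t, -1)
        out, h = self.gru(feats, h)
        return self.attitude_head(out), self.rate_head(out), h


if __name__ == "__main__":
    net = EventAttitudeNet()
    x = torch.randn(4, 10, 2, 64, 64)   # toy batch of event-frame sequences
    att, rates, _ = net(x)
    # Supervised regression against (here all-zero) reference attitude/rates.
    loss = nn.functional.mse_loss(att, torch.zeros_like(att)) \
        + nn.functional.mse_loss(rates, torch.zeros_like(rates))
    loss.backward()
    print(att.shape, rates.shape)        # (4, 10, 2), (4, 10, 3)
```

In a real pipeline the reference attitude and rates would come from a motion-capture system or onboard IMU logs used only as training labels, so that inference at flight time needs the event stream alone.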

Jesse J. Hagenaars, Stein Stroobants, Sander M. Bohte, Guido C. H. E. De Croon

Subjects: Aerospace Science and Technology; Aeronautics

Jesse J. Hagenaars, Stein Stroobants, Sander M. Bohte, Guido C. H. E. De Croon. All Eyes, no IMU: Learning Flight Attitude from Vision Alone[EB/OL]. (2025-07-15)[2025-07-23]. https://arxiv.org/abs/2507.11302.
