Live Demonstration: Neuromorphic Radar for Gesture Recognition
We present a neuromorphic radar framework for real-time, low-power hand gesture recognition (HGR) using an event-driven architecture inspired by biological sensing. Our system comprises a 24 GHz Doppler radar front-end and a custom neuromorphic sampler that converts intermediate-frequency (IF) signals into sparse spike-based representations via asynchronous sigma-delta encoding. These events are directly processed by a lightweight neural network deployed on a Cortex-M0 microcontroller, enabling low-latency inference without requiring spectrogram reconstruction. Unlike conventional radar HGR pipelines that continuously sample and process data, our architecture activates only when meaningful motion is detected, significantly reducing memory, power, and computation overhead. Evaluated on a dataset of five gestures collected from seven users, our system achieves > 85% real-time accuracy. To the best of our knowledge, this is the first work that employs bio-inspired asynchronous sigma-delta encoding and an event-driven processing framework for radar-based HGR.
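The asynchronous sigma-delta encoding described above can be illustrated with a minimal sketch. The snippet below is a hypothetical level-crossing (send-on-delta) encoder, not the authors' actual sampler: it emits a +1/-1 event only when the input drifts a full threshold away from the last encoded level, so a slowly varying IF signal produces sparse events and a static one produces none. The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def sigma_delta_encode(signal, threshold=0.2):
    """Level-crossing spike encoder (illustrative sketch only).

    Emits (sample_index, polarity) events whenever the input moves
    more than `threshold` from the last reconstructed level. The
    paper's asynchronous sigma-delta sampler may differ in detail.
    """
    events = []           # list of (index, +1 or -1) spike events
    level = signal[0]     # last encoded amplitude level
    for i in range(1, len(signal)):
        x = signal[i]
        # Emit positive events while the signal exceeds the level.
        while x - level >= threshold:
            level += threshold
            events.append((i, +1))
        # Emit negative events while the signal undershoots the level.
        while level - x >= threshold:
            level -= threshold
            events.append((i, -1))
    return events

# A slow 2 Hz sinusoid yields events only where the amplitude changes,
# mimicking how the radar front-end stays quiet without motion.
t = np.linspace(0.0, 1.0, 1000)
events = sigma_delta_encode(np.sin(2 * np.pi * 2 * t))
```

By construction the running level tracks the input to within one threshold, so summing the signed events (scaled by the threshold) approximately reconstructs the waveform; this is the sense in which the sparse events preserve the information needed for downstream inference.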
Satyapreet Singh Yadav, Akash K S, Chandra Sekhar Seelamantula, Chetan Singh Thakur
Subject: Radar
Satyapreet Singh Yadav, Akash K S, Chandra Sekhar Seelamantula, Chetan Singh Thakur. Live Demonstration: Neuromorphic Radar for Gesture Recognition [EB/OL]. (2025-08-06) [2025-08-16]. https://arxiv.org/abs/2508.03324.