Improved Decision Module Selection for Hierarchical Inference in Resource-Constrained Edge Devices
The Hierarchical Inference (HI) paradigm employs tiered processing: inferences on simple data samples are accepted at the end device, while complex data samples are offloaded to a central server. HI has recently emerged as an effective method for balancing inference accuracy, data processing, transmission throughput, and offloading cost. This approach is particularly efficient in scenarios involving resource-constrained edge devices, such as IoT sensors and microcontroller units (MCUs), tasked with executing tinyML inference. Notably, it outperforms strategies such as local inference execution, inference offloading to edge servers or cloud facilities, and split inference (i.e., inference execution distributed between two endpoints). Building upon the HI paradigm, this work explores different techniques aimed at further optimizing inference task execution. We propose and discuss three distinct HI approaches and evaluate their utility for image classification.
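The tiered-processing idea above can be illustrated with a minimal sketch of a confidence-threshold decision module. This is a hypothetical example, not the authors' method: `local_inference` stands in for a tinyML classifier on the end device (simulated here), `offload_inference` stands in for a larger server-side model, and the `0.8` threshold is an arbitrary illustrative value.

```python
import random

def local_inference(sample):
    """Stand-in for a tinyML classifier on the end device.
    Returns (predicted_label, confidence); simulated deterministically
    from the sample for illustration."""
    rng = random.Random(sample)
    label = rng.randrange(10)
    confidence = rng.uniform(0.3, 1.0)
    return label, confidence

def offload_inference(sample):
    """Stand-in for a large model running at the edge/cloud server."""
    return hash(("server", sample)) % 10

def hi_decision(sample, threshold=0.8):
    """HI decision module: accept the local prediction for 'simple'
    samples (high confidence) and offload 'complex' ones (low
    confidence) to the central server."""
    label, conf = local_inference(sample)
    if conf >= threshold:
        return label, "local"
    return offload_inference(sample), "offloaded"

# Example: classify a batch and see how many samples stayed on-device.
results = [hi_decision(s) for s in range(200)]
n_local = sum(1 for _, src in results if src == "local")
print(f"{n_local}/200 samples accepted locally")
```

The threshold trades accuracy against offloading cost: raising it offloads more samples to the (presumably more accurate) server model at higher transmission cost, which is the balance the HI approaches in this work aim to optimize.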
Jaya Prakash Champati, Adarsh Prasad Behera, Joerg Widmer, Roberto Morabito
Microelectronics; Integrated Circuits; Computing Technology; Computer Technology; Electronics Applications
Jaya Prakash Champati, Adarsh Prasad Behera, Joerg Widmer, Roberto Morabito. Improved Decision Module Selection for Hierarchical Inference in Resource-Constrained Edge Devices [EB/OL]. (2024-04-08) [2025-05-26]. https://arxiv.org/abs/2406.09424.