In-Network Flow Classification with Knowledge Distillation

Kate Ching Ju Lin*, Chen Yang Li

*Corresponding author of this work

Research output: Article · peer-reviewed

Abstract

Recent research has combined machine learning with software-defined networking to support intelligent traffic engineering. However, most frameworks only enable machine learning in remote controllers, which introduces significant signaling overhead and data forwarding costs. In this work, we present a new architecture called in-network inference (INI) to realize local learning on a Neural Compute Stick (NCS), a portable device that can be connected to a programmable switch via a USB port. While an NCS can flexibly extend the computing power of a switch, its limited capacity, however, cannot support real-time inference for enormous traffic demands. To develop a practical local learning architecture, we design a two-phase learning framework that combines local learning with knowledge distillation and remote learning to achieve lightweight yet accurate traffic classification. We further design an inference model deployment and adaptation algorithm that utilizes multiple NCS devices attached to different switches to share the inference workload of a network. Our testbed experiments show that the two-phase learning framework reduces the inference rejection rate by 46.5% and maintains an inference accuracy of 98.10%. The trace-driven simulations verify that the proposed adaptive model placement scheme considers load balancing and, hence, better utilizes the computing resources of NCS devices to serve dynamic inference requests.
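The knowledge-distillation step mentioned above trains a lightweight student classifier to mimic a larger teacher model by matching the teacher's softened output distribution in addition to the true flow labels. A minimal sketch of a standard distillation loss follows; the temperature `T`, weight `alpha`, and all tensor shapes are illustrative assumptions, not the paper's actual architectures or hyperparameters.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; a higher T yields a softer distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft (teacher-matching) loss and a hard-label loss.

    student_logits, teacher_logits: arrays of shape (batch, num_classes)
    labels: integer class indices of shape (batch,)
    """
    # Soft loss: cross-entropy of the student's softened outputs against the
    # teacher's softened outputs, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -np.mean(np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1)) * T * T

    # Hard loss: standard cross-entropy against the true flow labels.
    p_hard = softmax(student_logits)
    hard = -np.mean(np.log(p_hard[np.arange(len(labels)), labels] + 1e-12))

    return alpha * soft + (1 - alpha) * hard
```

In this formulation the student can be made small enough to run on the NCS, while the soft targets transfer the remote (teacher) model's knowledge about class similarities that hard labels alone would discard.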

Original language: English
Article number: 9496643
Pages (from-to): 111879-111889
Number of pages: 11
Journal: IEEE Access
Volume: 9
DOIs
Publication status: Published - 2021
