Quantized Reservoir Computing for Spectrum Sensing with Knowledge Distillation


Quantization is widely used to compress machine learning models for deployment on field-programmable gate arrays (FPGAs), but it often degrades a model's accuracy. In this work, we introduce a quantization approach that reduces a model's computation and storage requirements without a significant loss of accuracy. Spectrum sensing is a technique for identifying idle and busy bands in cognitive radio. Because the spectrum occupancy of each band is temporally correlated with previous and future time slots, a recurrent neural network (RNN) is well suited to spectrum sensing. Reservoir computing (RC) is a computational framework derived from RNN theory; it is a better choice than a conventional RNN for spectrum sensing on FPGAs because it is easier to train and requires fewer computational resources. We apply our quantization approach to RC to reduce its resource consumption on FPGAs. We further propose a knowledge distillation scheme, called teacher-student mutual learning, that minimizes the quantization error of the quantized RC. Teacher-student mutual learning resolves the capacity-mismatch issue of conventional knowledge distillation and enables knowledge distillation on small datasets. On the spectrum sensing dataset, the quantized RC trained with teacher-student mutual learning achieves accuracy comparable to the RNN while reducing the utilization of digital signal processing (DSP) blocks, flip-flops (FFs), and lookup tables (LUTs) by 53%, 40%, and 35%, respectively, and its inference is 2.4 times faster. Teacher-student mutual learning improves the accuracy of the quantized RC by 2.39%, outperforming conventional knowledge distillation.
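To make the pipeline concrete, the sketch below illustrates the two ingredients the abstract names: a reservoir computing model with quantized fixed weights, and a mutual-learning distillation loss. This is a minimal NumPy sketch under stated assumptions, not the thesis's implementation: it assumes an echo state network as the RC model, uniform symmetric post-training quantization of the fixed reservoir weights, and a symmetric-KL loss as a stand-in for teacher-student mutual learning. All class, function, and parameter names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=8):
    """Uniform symmetric quantization of a weight matrix (assumed scheme).

    Rounds w to 2**bits - 1 evenly spaced levels and returns the
    de-quantized values, simulating fixed-point storage on an FPGA.
    """
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

class ESN:
    """Minimal echo state network: fixed random reservoir, trained linear readout."""
    def __init__(self, n_in, n_res, n_out, spectral_radius=0.9, bits=None):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale so the reservoir satisfies the echo state property.
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.W_out = np.zeros((n_out, n_res))
        if bits is not None:
            # Only the fixed (untrained) weights are quantized in this sketch.
            self.W_in = quantize(self.W_in, bits)
            self.W = quantize(self.W, bits)

    def states(self, X):
        h = np.zeros(self.W.shape[0])
        out = []
        for x in X:
            h = np.tanh(self.W_in @ x + self.W @ h)
            out.append(h.copy())
        return np.array(out)

    def fit(self, X, Y, ridge=1e-6):
        # Ridge regression on the collected reservoir states: the only training step.
        H = self.states(X)
        self.W_out = Y.T @ H @ np.linalg.inv(H.T @ H + ridge * np.eye(H.shape[1]))

    def predict(self, X):
        return self.states(X) @ self.W_out.T

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mutual_kl(logits_a, logits_b):
    """Symmetric KL divergence between two models' outputs.

    Illustrative stand-in for the mutual-learning objective: both the
    full-precision and quantized models would be trained to minimize it.
    """
    pa, pb = softmax(logits_a), softmax(logits_b)
    kl_ab = np.sum(pa * np.log(pa / pb), axis=-1).mean()
    kl_ba = np.sum(pb * np.log(pb / pa), axis=-1).mean()
    return kl_ab + kl_ba
```

The readout is the only trained part, which is why RC avoids backpropagation through time and maps cheaply onto an FPGA; quantizing the fixed reservoir weights then shrinks the DSP/FF/LUT footprint while the mutual-learning loss pulls the quantized model's outputs back toward the full-precision model's.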



Quantization, Reservoir computing, Spectrum sensing, Model compression, Knowledge distillation, Cognitive radio