Quantized Reservoir Computing for Spectrum Sensing with Knowledge Distillation

dc.contributor.author: Liu, Shiya
dc.contributor.author: Liu, Lingjia
dc.contributor.author: Yi, Yang
dc.date.accessioned: 2023-02-28T15:48:45Z
dc.date.available: 2023-02-28T15:48:45Z
dc.date.issued: 2021-12-28
dc.date.updated: 2023-02-28T15:36:12Z
dc.description.abstract: Quantization has been widely used to compress machine learning models for deployment on field-programmable gate arrays (FPGAs). However, quantization often degrades model accuracy. In this work, we introduce a quantization approach that reduces a model's computation and storage resource consumption without losing much accuracy. Spectrum sensing is a technique for identifying idle/busy bandwidths in cognitive radio. The spectrum occupancy of each bandwidth is temporally correlated with previous and future time slots, so a recurrent neural network (RNN) is well suited to spectrum sensing. Reservoir computing (RC) is a computational framework derived from the theory of RNNs. It is a better choice than a conventional RNN for spectrum sensing on an FPGA because it is easier to train and requires fewer computational resources. We apply our quantization approach to RC to reduce its resource consumption on the FPGA. A knowledge distillation method called teacher-student mutual learning is proposed for the quantized RC to minimize quantization errors. Teacher-student mutual learning resolves the mismatched-capacity issue of conventional knowledge distillation and enables knowledge distillation on small datasets. On the spectrum sensing dataset, the quantized RC trained with teacher-student mutual learning achieves comparable accuracy while reducing the utilization of digital signal processing (DSP) blocks, flip-flops (FFs), and lookup tables (LUTs) by 53%, 40%, and 35%, respectively, compared to the RNN. The inference speed of the quantized RC is 2.4 times faster. Teacher-student mutual learning improves the accuracy of the quantized RC by 2.39%, outperforming conventional knowledge distillation.
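The abstract describes a fixed recurrent reservoir whose trainable readout is quantized for FPGA deployment. The sketch below illustrates that general idea with a standard echo state network and symmetric uniform int8 quantization of the readout weights; the paper's actual reservoir topology, bit width, and training procedure are not given in this record, so all sizes, targets, and the quantization scheme here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
n_in, n_res, n_out = 4, 100, 2

# Fixed, untrained reservoir: this is what makes RC cheaper to train
# than a conventional RNN -- only the linear readout is learned.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W_res = rng.uniform(-1.0, 1.0, (n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius < 1

def reservoir_states(U):
    """Drive the fixed reservoir with inputs U of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W_res @ x)
        states.append(x)
    return np.array(states)

def quantize(W, bits=8):
    """Symmetric uniform quantization of a weight matrix (assumed scheme)."""
    scale = np.abs(W).max() / (2 ** (bits - 1) - 1)
    return np.round(W / scale).astype(np.int8), scale

# Train only the readout with ridge regression on placeholder data.
U = rng.standard_normal((200, n_in))
Y = rng.standard_normal((200, n_out))  # placeholder targets
X = reservoir_states(U)
W_out = np.linalg.solve(X.T @ X + 1e-2 * np.eye(n_res), X.T @ Y).T

# Quantize the readout; on an FPGA the int8 matrix-vector product would
# run in fixed point, with a single dequantization by `scale` at the end.
W_q, scale = quantize(W_out)
y_float = X @ W_out.T
y_quant = (X @ W_q.T.astype(np.float64)) * scale
```

Only the readout is quantized here; since the reservoir weights are fixed and random, they could be quantized the same way without any retraining, which is part of what makes RC attractive for resource-constrained hardware.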
dc.description.version: Accepted version
dc.format.mimetype: application/pdf
dc.identifier.doi: https://doi.org/10.1109/TCDS.2022.3147789
dc.identifier.eissn: 2379-8939
dc.identifier.issn: 2379-8920
dc.identifier.issue: 99
dc.identifier.orcid: Yi, Yang [0000-0002-1354-0204]
dc.identifier.orcid: Liu, Lingjia [0000-0003-1915-1784]
dc.identifier.uri: http://hdl.handle.net/10919/114010
dc.language.iso: en
dc.publisher: IEEE
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Quantization
dc.subject: Reservoir computing
dc.subject: Spectrum sensing
dc.subject: Model compression
dc.subject: Knowledge distillation
dc.subject: Cognitive radio
dc.title: Quantized Reservoir Computing for Spectrum Sensing with Knowledge Distillation
dc.title.serial: IEEE Transactions on Cognitive and Developmental Systems
dc.type: Article - Refereed
dc.type.dcmitype: Text
dc.type.other: Article
pubs.organisational-group: /Virginia Tech
pubs.organisational-group: /Virginia Tech/All T&R Faculty
pubs.organisational-group: /Virginia Tech/Innovation Campus

Files

Original bundle
Name: Quantized_Reservoir_Computing_for_Spectrum_Sensing_with_Knowledge_Distillation.pdf
Size: 752.07 KB
Format: Adobe Portable Document Format
Description: Accepted version