Reservoir Computing: Foundations, Advances, and Challenges Toward Neuromorphic Intelligence

Date

2026-02-13

Publisher

MDPI

Abstract

Reservoir computing (RC) has emerged as an energy-efficient paradigm for temporal information processing, reducing training complexity by fixing the recurrent dynamics and training only a simple readout layer. Among RC models, Echo State Networks (ESNs) and Liquid State Machines (LSMs) represent two distinct approaches based on continuous-valued and spiking neural dynamics, respectively. In this work, we present a comparative evaluation of ESNs and LSMs on the Mackey–Glass chaotic time-series prediction task, with emphasis on scalability, overfitting behavior, and robustness to reduced numerical precision. Experimental results show that ESNs achieve lower prediction error with relatively small reservoirs but exhibit early performance saturation and signs of overfitting as reservoir size increases. In contrast, LSMs demonstrate more consistent generalization with increasing reservoir size and maintain stable performance under aggressive reservoir quantization. These findings highlight fundamental trade-offs between accuracy and hardware efficiency, and suggest that spiking RC models are well suited for energy-constrained and neuromorphic computing applications.
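To make the RC paradigm described above concrete, the following is a minimal illustrative sketch (not the authors' implementation) of an Echo State Network applied to one-step-ahead Mackey–Glass prediction: the reservoir weights are generated once and left fixed, and only a linear readout is fit by ridge regression. All hyperparameters (reservoir size, spectral radius, regularization) are illustrative choices, and the series is generated by a simple Euler discretization of the Mackey–Glass delay differential equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a Mackey-Glass series via Euler discretization of the delay
# differential equation dx/dt = beta*x(t-tau)/(1+x(t-tau)^10) - gamma*x(t);
# tau = 17 yields chaotic dynamics.
def mackey_glass(n, tau=17, beta=0.2, gamma=0.1, dt=1.0):
    x = np.zeros(n + tau)
    x[:tau] = 1.2
    for t in range(tau, n + tau - 1):
        x_tau = x[t - tau]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1 + x_tau**10) - gamma * x[t])
    return x[tau:]

series = mackey_glass(3000)

# Echo State Network: a fixed random reservoir; only the readout is trained.
N = 300                                       # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (N, 1))          # fixed input weights
W = rng.uniform(-0.5, 0.5, (N, N))             # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius to ~0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect tanh states."""
    states = np.zeros((len(u), N))
    x = np.zeros(N)
    for t, ut in enumerate(u):
        x = np.tanh(W_in[:, 0] * ut + W @ x)
        states[t] = x
    return states

washout, split = 100, 2000
u, y = series[:-1], series[1:]                 # one-step-ahead targets
S = run_reservoir(u)

# Ridge-regression readout -- the only trained part of the network.
lam = 1e-6
A, b = S[washout:split], y[washout:split]
W_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ b)

pred = S[split:] @ W_out
nrmse = np.sqrt(np.mean((pred - y[split:]) ** 2)) / np.std(y[split:])
print(f"test NRMSE: {nrmse:.4f}")
```

Because the reservoir is never trained, the only learning step is a single linear solve, which is what makes RC attractive for energy-constrained hardware; an LSM variant would replace the tanh units with spiking neurons and read out from filtered spike trains.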

Citation

Liu, A.; Azmine, M.F.; Lin, C.; Yi, Y. Reservoir Computing: Foundations, Advances, and Challenges Toward Neuromorphic Intelligence. AI 2026, 7, 70.