TY - JOUR
T1 - Liquid State Machine on SpiNNaker for Spatio-Temporal Classification Tasks
AU - Patiño-Saucedo, Alberto
AU - Rostro-González, Horacio
AU - Serrano-Gotarredona, Teresa
AU - Linares-Barranco, Bernabé
N1 - Publisher Copyright:
Copyright © 2022 Patiño-Saucedo, Rostro-González, Serrano-Gotarredona and Linares-Barranco.
PY - 2022/3/14
Y1 - 2022/3/14
N2 - Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks (SNNs). They have attracted research interest for their capacity to model biological structures and as promising pattern recognition tools suited to implementation in neuromorphic processors, benefiting from the modest computing resources required for their training. However, it has proven difficult to optimize LSMs for complex tasks such as event-based computer vision, and few implementations on large-scale neuromorphic processors have been attempted. In this work, we show that offline-trained LSMs implemented on the SpiNNaker neuromorphic processor can classify visual events, achieving state-of-the-art performance on the event-based N-MNIST dataset. The readout layer is trained using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static. Results show that mapping our LSM from a Deep Learning framework to SpiNNaker does not affect the performance of the classification task. Additionally, we show that weight quantization, which substantially reduces the memory footprint of the LSM, has only a small impact on its performance.
AB - Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks (SNNs). They have attracted research interest for their capacity to model biological structures and as promising pattern recognition tools suited to implementation in neuromorphic processors, benefiting from the modest computing resources required for their training. However, it has proven difficult to optimize LSMs for complex tasks such as event-based computer vision, and few implementations on large-scale neuromorphic processors have been attempted. In this work, we show that offline-trained LSMs implemented on the SpiNNaker neuromorphic processor can classify visual events, achieving state-of-the-art performance on the event-based N-MNIST dataset. The readout layer is trained using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static. Results show that mapping our LSM from a Deep Learning framework to SpiNNaker does not affect the performance of the classification task. Additionally, we show that weight quantization, which substantially reduces the memory footprint of the LSM, has only a small impact on its performance.
KW - Liquid State Machine
KW - N-MNIST
KW - neuromorphic hardware
KW - spiking neural network
KW - SpiNNaker
UR - http://www.scopus.com/inward/record.url?scp=85127564100&partnerID=8YFLogxK
UR - https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=pure_univeritat_ramon_llull&SrcAuth=WosAPI&KeyUT=WOS:000778559600001&DestLinkType=FullRecord&DestApp=WOS_CPL
U2 - 10.3389/fnins.2022.819063
DO - 10.3389/fnins.2022.819063
M3 - Article
C2 - 35360182
AN - SCOPUS:85127564100
SN - 1662-4548
VL - 16
JO - Frontiers in Neuroscience
JF - Frontiers in Neuroscience
M1 - 819063
ER -