
ESTU: Enabling Spiking Transformers on Ultra-Low-Power FPGAs

Leone, Gianluca; Busia, Paola; Orrù, Mauro; Raffo, Luigi; Meloni, Paolo
2025-01-01

Abstract

Spiking Neural Networks (SNNs) are computationally efficient algorithms that have proven very effective in edge applications. Recent efforts of the scientific community have investigated integrating the attention layer into Spiking Neural Networks to close the accuracy gap with non-spiking approaches. Despite their effectiveness, hardware implementations of spiking transformers have so far remained limited to high-end FPGA platforms, with power requirements incompatible with low-power applications. This work introduces ESTU, a compact spiking transformer hardware architecture tailored for edge deployment. ESTU has been designed with resource minimization in mind, so that it fits power- and resource-constrained FPGAs such as the Lattice iCE40UP5K. We validated ESTU on state-of-the-art Spiking Transformer models and on an sEMG classification application targeting NinaPro DB-5, demonstrating an accuracy improvement of over 2% compared to a vanilla SNN and fully unlocking the capability of spiking transformers at the edge, at a power cost of 3.76 mW during inference.
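The abstract refers to integrating an attention layer into an SNN. As an illustration only, the sketch below shows a Spikformer-style spiking self-attention step, in which queries, keys, and values are binarized by LIF-like neurons and attention is computed on spike tensors without softmax. All shapes, thresholds, and the stateless LIF simplification are hypothetical choices for readability and are not taken from the ESTU architecture.

import numpy as np

def lif_spikes(x, threshold=1.0):
    """Stateless LIF-style binarization: emit a spike (1.0) where the input
    exceeds the firing threshold; membrane dynamics are omitted for brevity."""
    return (x >= threshold).astype(np.float32)

def spiking_self_attention(s_in, w_q, w_k, w_v, threshold=1.0, scale=0.125):
    """s_in: (T, N, D) spike tensor over T time steps, N tokens, D features."""
    out = []
    for t in range(s_in.shape[0]):
        x = s_in[t]                          # (N, D) spikes at time step t
        q = lif_spikes(x @ w_q, threshold)   # spiking queries
        k = lif_spikes(x @ w_k, threshold)   # spiking keys
        v = lif_spikes(x @ w_v, threshold)   # spiking values
        attn = (q @ k.T) * scale             # spike-spike products, no softmax
        out.append(lif_spikes(attn @ v, threshold))
    return np.stack(out)                     # (T, N, D) output spike tensor

# Toy usage: 4 time steps, 8 tokens, 16 features (illustrative sizes).
rng = np.random.default_rng(0)
T, N, D = 4, 8, 16
s_in = (rng.random((T, N, D)) > 0.8).astype(np.float32)
w_q, w_k, w_v = (rng.normal(0, 0.5, (D, D)) for _ in range(3))
print(spiking_self_attention(s_in, w_q, w_k, w_v).shape)  # (4, 8, 16)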
2025
Edge AI
FPGA
low power
Spiking Neural Networks
Spiking Transformers
Tiny Transformers
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/460645
