A Training Algorithm for Locally Recurrent Neural Networks Based on the Explicit Gradient of the Loss Function

Carcangiu S.; Montisci A.
2025-01-01

Abstract

In this paper, a new algorithm for training Locally Recurrent Neural Networks (LRNNs) is presented, which aims to reduce computational complexity while guaranteeing the stability of the network during training. The main feature of the proposed algorithm is the capability to represent the gradient of the error in explicit form. The algorithm builds on the interpretation of the Fibonacci sequence as the output of a second-order IIR filter, which makes it possible to apply Binet's formula and thus compute the generic term of the sequence directly. Thanks to this approach, the gradient of the loss function can be calculated explicitly during training and expressed in terms of the parameters that control the stability of the neural network.
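
As a quick reference for the ingredient the abstract relies on, the following is a minimal sketch of the standard relations (textbook identities, not the paper's own derivation): the Fibonacci sequence is, up to a one-step index shift, the impulse response of a second-order IIR filter, and Binet's formula gives each term in closed form.

\[
F_n = F_{n-1} + F_{n-2}, \qquad F_0 = 0,\; F_1 = 1, \qquad
H(z) = \frac{1}{1 - z^{-1} - z^{-2}}, \qquad h[n] = F_{n+1},
\]
\[
F_n = \frac{\varphi^{n} - \psi^{n}}{\sqrt{5}}, \qquad
\varphi = \frac{1 + \sqrt{5}}{2}, \qquad \psi = \frac{1 - \sqrt{5}}{2}.
\]

Since the closed form is written directly in terms of the filter's poles (\(\varphi\) and \(\psi\) are the roots of \(z^2 = z + 1\)), an analogous expression for a generic second-order IIR synapse would allow the loss gradient to be written explicitly in the parameters that determine the pole locations, and hence the stability; this appears to be the mechanism the abstract points to, with the full derivation given in the paper.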
Keywords: Closed-form error gradient; Dynamic systems; Gradient-based training; Locally recurrent neural networks
Files in this record:
File: algorithms-18-00104.pdf (open access)
Type: publisher's version (VoR)
Format: Adobe PDF, 1.24 MB


Use this identifier to cite or link to this document: https://hdl.handle.net/11584/447845
Citations
  • PubMed Central: n/a
  • Scopus: 0
  • Web of Science: 0