Explainable AI-powered Graph Neural Networks for HD EMG-Based Gesture Intention Recognition

Massa S. M.; Riboni D.
2023-01-01

Abstract

The ability to recognize fine-grained gestures enables several applications in different domains, including healthcare, robotics, remote control, and human-computer interaction. Traditional gesture recognition systems rely on data acquired from cameras, depth sensors, or smart gloves. More recently, techniques for recognizing gestures based on signals acquired by high-density (HD) EMG electrodes worn on the forearm have been proposed. An advantage of these techniques is that they do not rely on external devices, and they are also usable by people who have undergone amputation. Unfortunately, the extraction of complex features from raw HD EMG signals may introduce delays that hinder the real-time requirements of the system. To address this issue, in a preliminary investigation we proposed to use graph neural networks for gesture recognition from raw HD EMG data. In this paper, we extend our previous work by exploiting Explainable AI algorithms to automatically refine the graph topology based on the data, in order to improve recognition rates and reduce the computational cost. We performed extensive experiments with a large dataset collected from 20 volunteers performing 65 fine-grained gestures, comparing our technique with state-of-the-art methods based on handcrafted features and different machine learning algorithms. Experimental results show that our technique outperforms the state of the art in terms of recognition performance while incurring significantly lower computational cost at run-time.
2023
Pervasive healthcare; gesture recognition; prosthetic arm control; HD EMG sensor data; GNN; deep learning
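
The abstract describes a GNN that operates directly on raw HD EMG channels, with the graph topology refined from the data. The following PyTorch snippet is a purely illustrative sketch, not the authors' model: it treats each electrode of an assumed 8x8 HD EMG grid as a graph node carrying a short window of raw samples, applies one graph-convolution step, and prunes low-importance edges as a crude stand-in for the paper's XAI-driven topology refinement. Grid size, window length, layer widths, the per-edge importance score, and the pruning rule are all assumptions.

```python
# Illustrative sketch only: a minimal graph convolution over an HD EMG
# electrode grid, with edge pruning by a learned importance score as a
# hedged stand-in for the paper's XAI-based topology refinement.
# Grid size, window length, layer sizes, and the pruning rule are
# assumptions, not the authors' actual configuration.
import torch
import torch.nn as nn

ROWS, COLS = 8, 8            # assumed 64-electrode HD EMG grid
N = ROWS * COLS              # one graph node per electrode
WINDOW = 50                  # assumed number of raw samples per node

def grid_edges(rows, cols):
    """4-neighbour adjacency between electrodes laid out on a grid."""
    edges = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                edges.append((i, i + 1))
            if r + 1 < rows:
                edges.append((i, i + cols))
    return edges

class TinyEMGGNN(nn.Module):
    def __init__(self, edges, n_nodes, in_dim, hidden, n_classes):
        super().__init__()
        self.n_nodes = n_nodes
        self.edges = edges
        # one learnable importance weight per undirected edge
        self.edge_score = nn.Parameter(torch.ones(len(edges)))
        self.lin1 = nn.Linear(in_dim, hidden)
        self.lin2 = nn.Linear(hidden, n_classes)

    def adjacency(self):
        """Dense row-normalized adjacency from the (possibly pruned) edges."""
        A = torch.eye(self.n_nodes)
        w = torch.sigmoid(self.edge_score)
        for k, (i, j) in enumerate(self.edges):
            A[i, j] = A[j, i] = w[k]
        return A / A.sum(dim=1, keepdim=True)

    def forward(self, x):
        # x: (batch, n_nodes, in_dim) raw EMG windows, one row per electrode
        A = self.adjacency()
        h = torch.relu(A @ self.lin1(x))   # one graph-convolution step
        h = h.mean(dim=1)                  # mean-pool over electrodes
        return self.lin2(h)

    def prune(self, keep_ratio=0.8):
        """Drop the lowest-scoring edges: a crude proxy for topology refinement."""
        k = int(len(self.edges) * keep_ratio)
        order = torch.argsort(self.edge_score, descending=True)[:k]
        self.edges = [self.edges[i] for i in order.tolist()]
        with torch.no_grad():
            self.edge_score = nn.Parameter(self.edge_score[order].clone())

model = TinyEMGGNN(grid_edges(ROWS, COLS), N, WINDOW, hidden=32, n_classes=65)
logits = model(torch.randn(4, N, WINDOW))  # dummy batch of 4 windows
print(logits.shape)                        # torch.Size([4, 65])
```

In the paper, edge relevance is derived from Explainable AI attributions rather than a learned scalar per edge; the sketch only conveys the overall node/edge structure and the idea of refining the topology by discarding uninformative connections.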
Files in this item:
Explainable_AI-powered_Graph_Neural_Networks_for_H.pdf
Open access
Description: Early Access
Type: publisher's version (VoR)
Size: 1.51 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/390610
Citations
  • PMC: N/A
  • Scopus: 2
  • Web of Science (ISI): 0