DEICTIC: a compositional and declarative gesture description based on hidden Markov models

Carcangiu, Alessandro; Spano, Lucio Davide; Fumera, Giorgio; Roli, Fabio
2019-01-01

Abstract

Consumer-level devices that track users' gestures have eased the design and implementation of interactive applications that rely on body movements as input. Gesture recognition based on computer vision and machine learning focuses mainly on accuracy and robustness: the resulting classifiers label gestures precisely once they have been completed, but they provide no intermediate information during execution. Human-Computer Interaction research has focused instead on providing easy and effective guidance for performing and discovering interactive gestures. The compositional approaches developed to solve this problem provide information on both the whole gesture and its sub-parts, but they rely on heuristic techniques with low recognition accuracy. In this paper, we introduce DEICTIC, a compositional and declarative description for stroke gestures that uses basic Hidden Markov Models (HMMs) to recognise meaningful predefined primitives (gesture sub-parts) and composes them to recognise complex gestures. It provides the information needed to support gesture guidance and, evaluated on two datasets from the literature, reaches an accuracy comparable with state-of-the-art approaches. Through a developer evaluation, we show that implementing a guidance system with DEICTIC requires an effort comparable to compositional approaches, while the definition procedure and the perceived recognition accuracy are comparable to machine-learning approaches.
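To make the compositional idea concrete, the following is a minimal sketch, not the authors' implementation: it trains one small left-to-right Gaussian HMM per stroke primitive using the hmmlearn Python library, then stitches the trained models into a single HMM with a sequential composition operator in the spirit of the paper's approach. The primitive names, state counts, the synthetic training data and the hand-off probability eps are illustrative assumptions, not values taken from the paper.

# Minimal sketch of HMM-based primitive composition (illustrative only).
import numpy as np
from hmmlearn import hmm

np.random.seed(7)  # reproducible synthetic data


def synth_line(p0, p1, n=20, noise=0.01):
    """Hypothetical stand-in for real training strokes: a noisy 2D segment."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    return p0 + t * (p1 - p0) + np.random.randn(n, 2) * noise


def train_primitive(samples, n_states=3):
    """Fit a left-to-right Gaussian HMM to 2D point sequences of one primitive."""
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="full",
                            init_params="mc", params="mct", n_iter=20)
    # Left-to-right topology: start in state 0, allow only self/forward moves.
    # Baum-Welch preserves the zero entries, so the topology survives fitting.
    model.startprob_ = np.r_[1.0, np.zeros(n_states - 1)]
    transmat = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        transmat[i, i] = transmat[i, i + 1] = 0.5
    transmat[-1, -1] = 1.0
    model.transmat_ = transmat
    model.fit(np.concatenate(samples), lengths=[len(s) for s in samples])
    return model


def sequence(models, eps=0.05):
    """Sequential composition: the composite accepts primitive 1, then 2, ..."""
    n = sum(m.n_components for m in models)
    comp = hmm.GaussianHMM(n_components=n, covariance_type="full")
    comp.startprob_ = np.r_[1.0, np.zeros(n - 1)]
    trans = np.zeros((n, n))
    offset = 0
    for i, m in enumerate(models):
        k = m.n_components
        trans[offset:offset + k, offset:offset + k] = m.transmat_
        if i < len(models) - 1:
            # Divert part of the last state's self-loop to the next primitive.
            trans[offset + k - 1, offset + k - 1] -= eps
            trans[offset + k - 1, offset + k] = eps
        offset += k
    comp.transmat_ = trans
    comp.means_ = np.vstack([m.means_ for m in models])
    comp.covars_ = np.concatenate([m.covars_ for m in models])
    return comp


# Usage: an "L" gesture modelled as a right stroke followed by a down stroke.
right = train_primitive([synth_line([0, 0], [1, 0]) for _ in range(30)])
down = train_primitive([synth_line([1, 0], [1, -1]) for _ in range(30)])
l_gesture = sequence([right, down])
candidate = np.vstack([synth_line([0, 0], [1, 0]), synth_line([1, 0], [1, -1])])
print(l_gesture.score(candidate))  # log-likelihood of the composed gesture

Because each primitive's states remain intact inside the composite model, the forward probabilities over those states indicate which sub-part the user is currently performing, which is the kind of intermediate information the abstract describes as the basis for gesture guidance.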
Keywords: Classification; Compositional gesture modelling; Declarative gesture modelling; Gestures; Hidden Markov models; Software; Human Factors and Ergonomics; Engineering (all); Human-Computer Interaction; Hardware and Architecture
Files in this product:

Cargangiu et al._International Journal of Human Computer Studies_2019.pdf
Description: article
Type: publisher's version (VoR)
Size: 4.62 MB
Format: Adobe PDF
Access: archive managers only (copy available on request)

paper.pdf
Description: main article
Type: post-print version (AAM)
Size: 2.66 MB
Format: Adobe PDF
Access: open access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/251572
Citations
  • PubMed Central: not available
  • Scopus: 11
  • Web of Science: 6