
Explaining Through the Right Reasoning Style: Lessons Learnt

Spano L. D.; Cau F. M.
2024-01-01

Abstract

Current eXplainable Artificial Intelligence (XAI) techniques assist individuals in interpreting AI recommendations. However, research primarily focuses on assessing users' comprehension of explanations, neglecting important factors influencing decision support, such as whether the explanation uses the right reasoning style to help the user understand the AI's advice. Over the last two years, our research has aimed to fill this gap by examining the effects of factors such as user uncertainty, AI correctness, and the interplay between AI confidence and explanation logic styles in classification tasks. In this paper, we summarise the lessons learnt from this research and discuss their impact on the engineering of AI-based decision support systems.
Year: 2024
ISBN: 9783031592348; 9783031592355
Keywords: Abductive; AI correctness; AI uncertainty; Deductive; Explainable AI; Explanations; Inductive; Logical Reasoning; User uncertainty
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/426704
Warning: the data displayed have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: 0