
The Influence of Curiosity Traits and On-Demand Explanations in AI-Assisted Decision-Making

Cau, Federico Maria; Spano, Lucio Davide
2025-01-01

Abstract

Previous research on eXplainable Artificial Intelligence (XAI) in AI-assisted decision-making has shown mixed results in increasing users' accuracy while mitigating overreliance on AI. A promising yet underexplored strategy is to provide AI assistance on demand through explicit interaction. Preliminary results show that users with high Need for Cognition (NFC) benefit more from such a paradigm, though the effects predicted by similar cognitive measures require further investigation. In addition, hybrid approaches that combine descriptive statistics on the training data (global data-centric) with model-centric explanations have shown the potential to mitigate overreliance while improving accuracy for experts and lay users in the health domain. However, the impact of this approach in other fields is still unknown.

This paper investigates the effects of four on-demand explanation types (local model-centric, global data-centric, local/global model-centric, and hybrid) on users' accuracy and overreliance. We also assess how variations in Need for Cognition (NFC), Epistemic Curiosity (EC), and Curiosity and Exploration Inventory-II (CEI-II) scores impact these metrics, and we explore correlations among these traits.

Our findings indicate no significant differences among the on-demand explanation types in improving accuracy or mitigating overreliance. The same holds for low- and high-scoring NFC, EC, and CEI-II individuals, although we found moderate positive correlations among these psychometrics. Post-hoc analysis revealed that personality traits and the on-demand intervention influenced other decision-making behaviors more than the type of explanation provided. Users who requested on-demand assistance exhibited lower confidence, suggesting that seeking data or AI support may undermine self-confidence. Interestingly, individuals with higher NFC and CEI-II scores showed greater confidence, and those scoring higher on CEI-II requested AI assistance less frequently.

We contribute to expanding knowledge about XAI-assisted decision-making by providing practical guidelines for designing AI systems that account for individual cognitive traits and user confidence, helping to improve their effectiveness in decision-making tasks.
Keywords
Accuracy
AI confidence
AI-assisted decisions
Curiosity and Exploration Inventory II
Data-centric explanations
Epistemic Curiosity
Explainable AI
Job applicants data
Multifaceted explanations
Need for Cognition
Overreliance
Files in this item:

File: 3708359.3712165.pdf
Access: open access
Type: publisher's version (VoR)
Size: 2.29 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/459566
Citations
  • PubMed Central: ND
  • Scopus: 0
  • Web of Science: 0