
Performance of a deep learning algorithm for the evaluation of CAD-RADS classification with CCTA

Palmisano V.; Saba L.;
2020-01-01

Abstract

Background and aims: Artificial intelligence (AI) is playing an increasing role in the diagnosis of patients with suspected coronary artery disease. The aim of this study was to develop a deep convolutional neural network (CNN) to classify coronary computed tomography angiography (CCTA) examinations into the correct Coronary Artery Disease Reporting and Data System (CAD-RADS) category. Methods: Two hundred eighty-eight patients who underwent clinically indicated CCTA were included in this single-center retrospective study. The CCTAs were stratified by CAD-RADS score by expert readers, and this reading was considered the reference standard. A deep CNN was designed, tested on the CCTA dataset, and compared with on-site reading. The diagnostic accuracy of the deep CNN was assessed for three models based on the CAD-RADS classification: Model A (CAD-RADS 0 vs CAD-RADS 1–2 vs CAD-RADS 3–5), Model 1 (CAD-RADS 0 vs CAD-RADS > 0), and Model 2 (CAD-RADS 0–2 vs CAD-RADS 3–5). The time of analysis was recorded for both physicians and the CNN. Results: Model A showed a sensitivity, specificity, negative predictive value, positive predictive value and accuracy of 47%, 74%, 77%, 46% and 60%, respectively. Model 1 showed a sensitivity, specificity, negative predictive value, positive predictive value and accuracy of 66%, 91%, 92%, 63% and 86%, respectively. Model 2 showed a sensitivity, specificity, negative predictive value, positive predictive value and accuracy of 82%, 58%, 74%, 69% and 71%, respectively. Time of analysis was significantly lower with the CNN than with on-site reading (on-site 530.5 ± 179.1 s vs CNN 104.3 ± 1.4 s, p = 0.01). Conclusions: The deep CNN yielded accurate automated CAD-RADS classification of patients.
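The abstract groups CAD-RADS scores into three classification schemes (Model A, Model 1, Model 2) and reports standard diagnostic metrics against the expert reference standard. The paper does not publish its code, so the sketch below is only an illustration of how such a grouping and the reported metrics (sensitivity, specificity, NPV, PPV, accuracy) could be computed from predicted and reference labels; the function names and toy data are hypothetical and not taken from the study.

```python
# Illustrative sketch only: label groupings reconstructed from the abstract,
# metric formulas are the standard definitions. Names and data are hypothetical.
from typing import Dict, Sequence


def cadrads_to_label(score: int, model: str) -> int:
    """Map a CAD-RADS score (0-5) to the class label used by each model."""
    if model == "A":          # Model A: 0 vs 1-2 vs 3-5 (three classes)
        if score == 0:
            return 0
        return 1 if score <= 2 else 2
    if model == "1":          # Model 1: 0 vs >0 (binary)
        return 0 if score == 0 else 1
    if model == "2":          # Model 2: 0-2 vs 3-5 (binary)
        return 0 if score <= 2 else 1
    raise ValueError(f"unknown model: {model}")


def binary_metrics(y_true: Sequence[int], y_pred: Sequence[int]) -> Dict[str, float]:
    """Sensitivity, specificity, NPV, PPV and accuracy for a binary task
    (positive class = 1). Assumes every confusion-matrix cell denominator > 0."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
        "ppv": tp / (tp + fp),
        "accuracy": (tp + tn) / len(y_true),
    }


if __name__ == "__main__":
    # Toy example: expert CAD-RADS scores vs. hypothetical CNN predictions,
    # evaluated under the Model 2 grouping (CAD-RADS 0-2 vs 3-5).
    expert = [0, 1, 2, 3, 4, 5, 0, 2]
    cnn    = [0, 0, 2, 3, 5, 5, 1, 3]
    y_true = [cadrads_to_label(s, "2") for s in expert]
    y_pred = [cadrads_to_label(s, "2") for s in cnn]
    print(binary_metrics(y_true, y_pred))
```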
Artificial intelligence; CADRADS; Convolutional neural network; Coronary artery disease; Plaque characterization
Files in this item:

File: 10.1016@j.atherosclerosis.2019.12.001.pdf
Access: open access
Type: pre-print version
Size: 10.17 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/283798
Citations
  • PubMed Central: 28
  • Scopus: 67
  • Web of Science: 51