
Improving Fast Minimum-Norm Attacks with Hyperparameter Optimization

Giuseppe Floris; Raffaele Mura; Luca Scionis; Giorgio Piras; Maura Pintor; Ambra Demontis; Battista Biggio

Abstract

Evaluating the adversarial robustness of machine-learning models using gradient-based attacks is challenging. In this work, we show that hyperparameter optimization can improve fast minimum-norm attacks by automating the selection of the loss function, the optimizer, and the step-size scheduler, along with the corresponding hyperparameters. Our extensive evaluation involving several robust models demonstrates the improved efficacy of fast minimum-norm attacks when combined with hyperparameter optimization. We release our open-source code at https://github.com/pralab/HO-FMN.
Year: 2023
ISBN: 978-2-87587-088-9
Keywords: Machine Learning; Adversarial Machine Learning; Optimization
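
To make the approach in the abstract concrete, below is a minimal sketch of how such a search could be set up with an off-the-shelf hyperparameter optimizer such as Optuna: the loss function, the optimizer, and the step-size scheduler are treated as categorical choices, and their hyperparameters as continuous ones. This is not the authors' HO-FMN implementation (see the linked repository for that); the helper evaluate_fmn_config, the candidate values, and the trial budget are illustrative assumptions.

    # Illustrative sketch only: searching over an FMN attack configuration with Optuna.
    import random

    import optuna


    def evaluate_fmn_config(loss, optimizer, scheduler, step_size, momentum):
        # Hypothetical stand-in: a real evaluation would run the FMN attack with this
        # configuration on a validation batch and return the median norm of the
        # minimal adversarial perturbations it finds (lower = stronger attack).
        return random.random()  # synthetic score so the sketch runs end to end


    def objective(trial):
        # Discrete choices: loss function, optimizer, and step-size scheduler.
        loss = trial.suggest_categorical("loss", ["ce", "logit_diff", "dlr"])
        optimizer = trial.suggest_categorical("optimizer", ["sgd", "adam"])
        scheduler = trial.suggest_categorical("scheduler", ["cosine", "multistep"])
        # Continuous hyperparameters of the chosen optimizer/scheduler.
        step_size = trial.suggest_float("step_size", 1e-3, 1.0, log=True)
        momentum = trial.suggest_float("momentum", 0.0, 0.99)
        return evaluate_fmn_config(loss, optimizer, scheduler, step_size, momentum)


    study = optuna.create_study(direction="minimize")  # smaller perturbations are better
    study.optimize(objective, n_trials=50)
    print("Best attack configuration:", study.best_params)

Minimizing the median perturbation norm on held-out samples is one reasonable objective for such a search; the best configuration found can then be used to attack the full test set.
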
Files in this record:

ES2023-164 (1).pdf
  Type: publisher's version (VoR)
  Access: restricted to archive administrators (copy available on request)
  Size: 1.69 MB
  Format: Adobe PDF

2310.08177.pdf
  Type: pre-print
  Access: open access
  Size: 443.53 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/381983