Adversarial Machine Learning: Attacks From Laboratories to the Real World

Biggio, Battista
2021-01-01

Abstract

Adversarial machine learning (AML) is a recent research field that investigates potential security issues related to the use of machine learning (ML) algorithms in modern artificial intelligence (AI)-based systems, along with defensive techniques to protect ML algorithms against such threats. The main threats against ML encompass a set of techniques that aim to mislead ML models through adversarial input perturbations. Unlike ML-enabled crimes, in which ML is used for malicious and offensive purposes, and ML-enabled security mechanisms, in which ML is used for securing existing systems, AML techniques exploit and specifically address the security vulnerabilities of ML algorithms.
Keywords: Adversarial machine learning; Data models; Training data; Biological system modeling
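As a concrete illustration of the adversarial input perturbations mentioned in the abstract (not taken from the paper itself), the fast gradient sign method (FGSM) of Goodfellow et al. (2015) perturbs an input along the sign of the loss gradient. The minimal sketch below assumes a differentiable PyTorch classifier `model` and inputs scaled to [0, 1]; the perturbation budget `eps` is an illustrative choice, not a value from the paper.

import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, eps=0.03):
    # Illustrative FGSM sketch: increase the classification loss within
    # an L-infinity ball of radius eps around the clean input x.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    # One signed-gradient step, then clip back to the valid input range.
    x_adv = (x_adv + eps * x_adv.grad.sign()).clamp(0.0, 1.0)
    return x_adv.detach()

Even this single-step perturbation, visually imperceptible at small eps for images in [0, 1], is often enough to flip the prediction of an undefended classifier; this is the class of vulnerability that AML defenses aim to mitigate.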
Files in this record:
File: biggio21-IEEEComp.pdf
Description: online article
Type: publisher's version (VoR)
Size: 552.02 kB
Format: Adobe PDF
Access: archive administrators only (copy available on request)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/313634
Citations
  • PMC: n/a
  • Scopus: 18
  • Web of Science: 12