Fingerprint Adversarial Presentation Attack in the Physical Domain

Casula R.; Orrù G.; Marcialis G. L.; Sansone C.
2021-01-01

Abstract

With the advent of the deep learning era, Fingerprint-based Authentication Systems (FAS) equipped with Fingerprint Presentation Attack Detection (FPAD) modules managed to avoid attacks on the sensor through artificial replicas of fingerprints. Previous works highlighted the vulnerability of FPADs to digital adversarial attacks. However, in a realistic scenario, attackers may not be able to directly feed a digitally perturbed image to the deep-learning-based FPAD, since the channel between the sensor and the FPAD is usually protected. In this paper, we thus investigate the threat level associated with adversarial attacks against FPADs in the physical domain. By materially realising fakes from the adversarial images, we were able to insert them into the system directly through the "exposed" part, the sensor. To the best of our knowledge, this represents the first proof-of-concept of a fingerprint adversarial presentation attack. We evaluated how much the liveness score changed when feeding the system with digital and printed adversarial images. To measure what portion of this increase is due to the printing itself, we also re-printed the original spoof images, without injecting any perturbation. Experiments conducted on the LivDet 2015 dataset demonstrate that the printed adversarial images achieve ∼100% attack success rate against an FPAD if the attacker can present multiple attacks to the sensor (10), and a fairly good result (∼28%) in a one-shot scenario. Although this work must be considered a proof-of-concept, it constitutes a promising pioneering attempt confirming that an adversarial presentation attack is feasible and dangerous.
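
The abstract does not specify the perturbation algorithm used to craft the adversarial spoof images, so the sketch below is only a rough illustration of the pipeline it describes: a generic iterative gradient-sign (I-FGSM-style) perturbation that pushes a spoof fingerprint image toward a higher liveness score, plus a helper reflecting the one-shot versus 10-attempt evaluation. All names (fpad_net, perturb_spoof, multi_attempt_success), hyperparameters, and the class-index convention are assumptions for illustration, not the authors' implementation.

# Illustrative sketch only; not the method used in the paper.
import torch
import torch.nn.functional as F

def perturb_spoof(fpad_net, spoof_img, eps=8/255, alpha=1/255, steps=10):
    """Nudge a spoof fingerprint image toward a higher liveness score.

    fpad_net  -- hypothetical CNN returning logits [fake, live] for a (1, 1, H, W) input
    spoof_img -- spoof image tensor in [0, 1], shape (1, 1, H, W)
    eps       -- L-infinity budget, kept small so the printed replica stays realistic
    """
    adv = spoof_img.clone().detach()
    live_label = torch.tensor([1])          # index of the "live" class (assumed)
    for _ in range(steps):
        adv.requires_grad_(True)
        logits = fpad_net(adv)
        # Minimising the cross-entropy w.r.t. the "live" label raises the live score
        loss = F.cross_entropy(logits, live_label)
        grad, = torch.autograd.grad(loss, adv)
        with torch.no_grad():
            adv = adv - alpha * grad.sign()                        # step toward "live"
            adv = spoof_img + (adv - spoof_img).clamp(-eps, eps)   # stay within budget
            adv = adv.clamp(0.0, 1.0)                              # keep valid pixels
        adv = adv.detach()
    return adv

def multi_attempt_success(live_scores, threshold, k=10):
    """An attack counts as successful if any of its first k presentations to the
    sensor yields a liveness score above the FPAD acceptance threshold (assumed
    protocol, consistent with the reported 10-attempt vs. one-shot rates)."""
    return any(s > threshold for s in live_scores[:k])

In this reading, the perturbed image would then be printed and presented to the sensor, which explains why the multi-attempt rate can approach 100% while the one-shot rate stays much lower: each presentation introduces printing and acquisition noise, so repeated attempts increase the chance that at least one capture crosses the threshold.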
Year: 2021
ISBN: 978-3-030-68779-3; 978-3-030-68780-9
Keywords: Adversarial; FPAD; Liveness
Files in this product:
File: ICPR2020_AdvFinger.pdf
Description: Paper
Type: post-print version
Format: Adobe PDF
Size: 1.18 MB
Open Access since 26/02/2022
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/317323
Citations:
  • Scopus: 8