Tessellation-Filtering ReLU Neural Networks

Author(s): Biggio, Battista;
Date: 2022-01-01

Abstract

We identify tessellation-filtering ReLU neural networks that, when composed with another ReLU network, keep its non-redundant tessellation unchanged or reduce it. The additional network complexity thus modifies the shape of the decision surface without increasing the number of linear regions. We provide a mathematical account of this additional expressiveness through a novel measure of shape complexity that counts deviations from convexity, which leads to a Boolean-algebraic characterization of this special class of networks. A local representation theorem gives rise to novel approaches for pruning and for decision-surface analysis.
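
As an illustration of the linear regions referred to in the abstract (not taken from the paper), the following Python sketch counts the distinct ReLU activation patterns of a small, randomly initialized two-layer network over a 2-D input grid; each distinct pattern corresponds to one cell of the network's tessellation that the grid happens to hit. The architecture, weights, and sampling grid are arbitrary assumptions made purely for illustration.

```python
# Illustrative sketch (not from the paper): estimate how many linear regions
# of a toy ReLU network are hit by a grid of 2-D inputs, by counting distinct
# on/off activation patterns. Architecture and grid are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-hidden-layer ReLU network with random weights (assumed setup).
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)

def activation_pattern(x):
    """Return the ReLU on/off pattern of both hidden layers for input x."""
    h1 = W1 @ x + b1
    a1 = np.maximum(h1, 0.0)
    h2 = W2 @ a1 + b2
    return tuple((h1 > 0).tolist()) + tuple((h2 > 0).tolist())

# Sample a grid over [-2, 2]^2; each distinct pattern corresponds to one
# linear region (cell of the tessellation) that the grid intersects.
xs = np.linspace(-2.0, 2.0, 200)
patterns = {activation_pattern(np.array([x, y])) for x in xs for y in xs}
print("linear regions hit by the grid:", len(patterns))
```

This only lower-bounds the true number of regions (regions missed by the grid are not counted), but it conveys how the tessellation and its size are defined in terms of activation patterns.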
Year: 2022
ISBN: 978-1-956792-00-3
Keywords: Machine Learning Theory; Machine Learning, Theory of Deep Learning
Files for this record:
  File: 0463.pdf (restricted to archive administrators; request a copy)
  Description: paper online
  Type: publisher's version (VoR)
  Size: 319.8 kB
  Format: Adobe PDF


Use this identifier to cite or link to this record: https://hdl.handle.net/11584/349285
Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science: not available