Tessellation-Filtering ReLU Neural Networks
Biggio, Battista;
2022-01-01
Abstract
We identify tessellation-filtering ReLU neural networks that, when composed with another ReLU network, keep its non-redundant tessellation unchanged or reduce it. The additional network complexity modifies the shape of the decision surface without increasing the number of linear regions. We provide a mathematical understanding of the associated additional expressiveness via a novel measure of shape complexity that counts deviations from convexity, which yields a Boolean-algebraic characterization of this special class. A local representation theorem gives rise to novel approaches to pruning and decision-surface analysis.
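The tessellation mentioned in the abstract is the partition of input space into linear regions induced by a ReLU network. As a purely illustrative sketch (not taken from the paper), the snippet below estimates how many linear regions a toy one-hidden-layer ReLU network exhibits by counting distinct hidden-unit activation patterns on a sample grid; the layer size, weights, and grid resolution are all hypothetical choices for demonstration.

```python
import numpy as np

# Purely illustrative: hidden layer of a one-hidden-layer ReLU network
# f: R^2 -> R with hypothetical random weights (8 hidden units).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)

def activation_pattern(x):
    """Return the on/off pattern of the hidden ReLU units at input x."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Each distinct activation pattern corresponds to one affine piece of the
# network, so the number of patterns seen on a fine grid lower-bounds the
# number of linear regions that the grid intersects.
grid = np.linspace(-3.0, 3.0, 200)
patterns = {activation_pattern(np.array([x, y])) for x in grid for y in grid}
print(f"distinct activation patterns sampled: {len(patterns)}")
```

A tessellation-filtering network composed on top of such a network would, per the abstract, reshape the decision surface without increasing this count.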
File | Description | Type | Size | Format | Access
---|---|---|---|---|---
0463.pdf | paper online | Published version (VoR) | 319.8 kB | Adobe PDF | Archive managers only (View/Open: request a copy)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.