Magnitude and Uncertainty Pruning Criterion for Neural Networks

Ko, Vinnie; Oehmcke, Stefan; Gieseke, Fabian


Abstract

Neural networks have achieved dramatic improvements in recent years and now represent the state-of-the-art for many real-world tasks. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Overparameterization can also lead to undesired overfitting. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from the field of statistics, we introduce a novel magnitude and uncertainty (M&U) pruning criterion that helps to lessen these shortcomings. One important advantage of our M&U pruning criterion is that it is scale-invariant, a property that purely magnitude-based pruning criteria lack. In addition, we present a "pseudo bootstrap" scheme, which can efficiently estimate the uncertainty of the weights from their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that our new criterion leads to more compressed models than purely magnitude-based pruning criteria, while at the same time losing less predictive power.
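As a rough illustration of the abstract's idea, the sketch below scores each weight with a Wald-style statistic, |w| divided by an estimated standard error, and prunes the lowest-scoring weights. The standard-error estimate from weight snapshots collected during training, the function names, and the fraction-based pruning step are all illustrative assumptions, not the paper's exact formulation of the M&U criterion or the pseudo bootstrap.

import numpy as np

def pseudo_bootstrap_std(weight_snapshots, eps=1e-12):
    # Illustrative stand-in for the paper's "pseudo bootstrap":
    # per-weight standard deviation across snapshots recorded during
    # training (the exact estimator is an assumption here).
    return np.stack(weight_snapshots).std(axis=0) + eps

def m_and_u_scores(weights, weight_std):
    # Wald-style magnitude-and-uncertainty score |w| / se(w); rescaling
    # weights and their uncertainties by a common factor leaves the
    # score unchanged, which is the scale-invariance property.
    return np.abs(weights) / weight_std

def prune_lowest(weights, scores, fraction=0.5):
    # Zero out the given fraction of weights with the lowest scores.
    k = int(fraction * weights.size)
    threshold = np.partition(scores.ravel(), k)[k]
    return np.where(scores >= threshold, weights, 0.0)

# Hypothetical usage: snapshots of one layer's weights over training.
snapshots = [np.random.randn(64, 32) * 0.1 for _ in range(10)]
w = snapshots[-1]
pruned = prune_lowest(w, m_and_u_scores(w, pseudo_bootstrap_std(snapshots)))

Dividing by the estimated uncertainty is what distinguishes this score from plain magnitude pruning: a small weight that barely moved during training can outrank a larger but highly unstable one.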

Keywords
neural networks; pruning



Publication type
Research article in edited volume (conference)

Peer-reviewed
Yes

Publication status
Published

Year
2019

Conference
IEEE Big Data, Intelligent Data Mining Special Session

Conference location
Los Angeles

Book title
2019 IEEE International Conference on Big Data (IEEE BigData)

Editors
Baru, Chaitanya K.; Huan, Jun; Khan, Latifur; Hu, Xiaohua; Ak, Ronay; Tian, Yuanyuan; Barga, Roger S.; Zaniolo, Carlo; Lee, Kisung; Ye, Yanfang (Fanny)

First page
2317

Last page
2326

Publisher
IEEE

Place
Los Angeles, USA

Language
English

DOI

Full text