Input Selection for Bandwidth-Limited Neural Network Inference

Oehmcke, Stefan; Gieseke, Fabian


Abstract

Data are often stored on centralized storage servers. This is the case, for instance, in remote sensing and astronomy, where projects produce several petabytes of data every year. While machine learning models are often trained on relatively small subsets of the data, the inference phase typically requires transferring significant amounts of data between the servers and the clients. In many cases, the bandwidth available per user is limited, which makes the data transfer one of the major bottlenecks. In this work, we propose a framework that automatically selects the relevant parts of the input data for a given neural network. The model and the associated selection masks are trained simultaneously such that good model performance is achieved while only a minimal amount of data is selected. During the inference phase, only those parts of the data have to be transferred between the server and the client. We propose both instance-independent and instance-dependent selection masks. The former are the same for all instances to be transferred, whereas the latter allow for variable transfer sizes per instance. Our experiments show that it is often possible to significantly reduce the amount of data that needs to be transferred without substantially affecting model quality.
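
The abstract describes selection masks that are trained jointly with the network so that little input data must be transferred at inference time. The following is a minimal, hypothetical sketch (not the authors' implementation) of one way an instance-independent mask could be learned jointly with a classifier: sigmoid gates scale the input features and an L1 penalty drives most gates towards zero. All class names, layer sizes, and the sparsity weight are illustrative assumptions.

```python
import torch
import torch.nn as nn


class MaskedModel(nn.Module):
    """Classifier with a jointly trained, instance-independent input mask (illustrative sketch).

    The mask logits are learnable parameters; a sigmoid turns them into soft
    gates in (0, 1) that scale each input feature. An L1 penalty on the gates
    pushes most of them towards zero, so at inference time only features whose
    gates exceed a threshold would need to be transferred from the server.
    """

    def __init__(self, n_features: int, n_classes: int, hidden: int = 128):
        super().__init__()
        self.mask_logits = nn.Parameter(torch.zeros(n_features))
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def gates(self) -> torch.Tensor:
        return torch.sigmoid(self.mask_logits)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x * self.gates())


def training_step(model, x, y, optimizer, sparsity_weight=1e-3):
    """One joint update of the model weights and the selection mask."""
    optimizer.zero_grad()
    logits = model(x)
    task_loss = nn.functional.cross_entropy(logits, y)
    sparsity_loss = model.gates().abs().mean()  # encourages selecting few inputs
    loss = task_loss + sparsity_weight * sparsity_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```

An instance-dependent variant, as mentioned in the abstract, would instead predict the gates from the input itself (e.g., with a small auxiliary network), allowing the transfer size to vary per instance; the joint loss structure would stay the same.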

Keywords
deep learning; big data; satellite data



Publication type
Research article in conference proceedings

Peer-reviewed
Yes

Publication status
Published

Year
2022

Conference
SIAM International Conference on Data Mining (SDM)

Conference location
Alexandria, Virginia

Book title
Proceedings of the 2022 SIAM International Conference on Data Mining (SDM)

Editors
Banerjee, Arindam; Zhou, Zhi-Hua; Papalexakis, Evangelos E.; Riondato, Matteo

First page
280

Last page
288

Publisher
Society for Industrial and Applied Mathematics (SIAM)

Place
USA

Language
English

DOI