Input Selection for Bandwidth-Limited Neural Network Inference

Oehmcke, Stefan; Gieseke, Fabian


Abstract

Data are often stored on centralized storage servers. This is the case, for instance, in remote sensing and astronomy, where projects produce several petabytes of data every year. While machine learning models are often trained on relatively small subsets of the data, the inference phase typically requires transferring significant amounts of data between the servers and the clients. In many cases, the bandwidth available per user is limited, which makes the data transfer one of the major bottlenecks. In this work, we propose a framework that automatically selects the relevant parts of the input data for a given neural network. The model as well as the associated selection masks are trained simultaneously such that a good model performance is achieved while only a minimal amount of data is selected. During the inference phase, only those parts of the data have to be transferred between the server and the client. We propose both instance-independent and instance-dependent selection masks. The former are the same for all instances to be transferred, whereas the latter allow for variable transfer sizes per instance. Our experiments show that it is often possible to significantly reduce the amount of data that needs to be transferred without significantly affecting the model quality.
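The abstract describes jointly training a network together with a selection mask under a sparsity objective, so that only the selected input features would have to be transferred at inference time. The sketch below illustrates this general idea for the instance-independent case; the sigmoid-relaxed mask, the L1-style sparsity penalty, the network architecture, and all names (MaskedModel, train_step, sparsity_weight) are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: an instance-independent selection mask trained jointly
# with the model, traded off against a sparsity penalty. Not the authors' code.
import torch
import torch.nn as nn


class MaskedModel(nn.Module):
    def __init__(self, in_features: int, hidden: int = 64, out_features: int = 10):
        super().__init__()
        # One logit per input feature; sigmoid(logit) acts as a soft mask in [0, 1].
        self.mask_logits = nn.Parameter(torch.zeros(in_features))
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_features),
        )

    def mask(self) -> torch.Tensor:
        return torch.sigmoid(self.mask_logits)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gate the input: features whose mask value is near zero contribute
        # (almost) nothing and would not have to be transferred from the server.
        return self.net(x * self.mask())


def train_step(model, x, y, optimizer, sparsity_weight=1e-2):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    # The sparsity penalty pushes mask entries toward zero, trading model
    # quality against the amount of data that must be transferred.
    loss = loss + sparsity_weight * model.mask().sum()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = MaskedModel(in_features=100)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(32, 100)             # toy batch standing in for remote data
    y = torch.randint(0, 10, (32,))
    for _ in range(5):
        train_step(model, x, y, optimizer)
    # After training, threshold the mask to decide which features to transfer.
    selected = (model.mask() > 0.5).nonzero().flatten()
    print(f"features to transfer: {selected.numel()} / 100")
```

An instance-dependent variant would, under the same assumptions, replace the fixed mask_logits with a small gating network that predicts a mask from a cheap summary of each instance, allowing the transfer size to vary per instance.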

Keywords
deep learning; big data; satellite data



Publication type
Research article in proceedings (conference)

Peer reviewed
Yes

Publication status
Published

Year
2022

Conference
SIAM International Conference on Data Mining (SDM)

Venue
Alexandria, Virginia

Book title
Proceedings of the 2022 SIAM International Conference on Data Mining (SDM)

Editor
Banerjee, Arindam; Zhou, Zhi-Hua; Papalexakis, Evangelos E.; Riondato, Matteo

Start page
280

End page
288

Publisher
Society for Industrial and Applied Mathematics (SIAM)

Place
USA

Language
English

DOI