Boosted Trees on a Diet: Compact Models for Resource-Constrained Devices

Herrmann, Nina; Stenkamp, Jan; Karic, Benjamin; Oehmke, Stefan; Gieseke, Fabian

Abstract

Deploying machine learning models on compute-constrained devices has become a key building block of modern IoT applications. In this work, we present a compression scheme for boosted decision trees, addressing the growing need for lightweight machine learning models. Specifically, we provide techniques for training compact boosted decision tree ensembles that exhibit a reduced memory footprint by rewarding, among other things, the reuse of features and thresholds during training. Our experimental evaluation shows that, using an adapted training process and an alternative memory layout, the resulting models match the performance of LightGBM models at a compression ratio of 4–16x. Once deployed, the corresponding IoT devices can operate independently of constant communication or an external energy supply, and thus autonomously, requiring only minimal computing power and energy. This capability opens the door to a wide range of IoT applications, including remote monitoring, edge analytics, and real-time decision making in isolated or power-limited environments.

Keywords

TinyML, Boosting, Decision Trees, Microcontrollers, IoT

Cite as

Herrmann, N., Stenkamp, J., Karic, B., Oehmke, S., & Gieseke, F. (2026). Boosted Trees on a Diet: Compact Models for Resource-Constrained Devices. (accepted / in press (not yet published))

Details

Publication type
Research article in digital collection (conference)

Peer reviewed
Yes

Publication status
accepted / in press (not yet published)

Year
2026

Conference
The Fourteenth International Conference on Learning Representations

Venue
Rio de Janeiro

Book title
The Fourteenth International Conference on Learning Representations

Editor
ICLR

Language
English
