Toward LLM-enabled Business Process Coherence Checking Based on Multi-level Process Documentation

Schulte, Marek; Franzoi, Sandro; Köhne, Frank; vom Brocke, Jan


Abstract

This paper develops aProCheCk, an Autonomous Process Coherence Checking method. aProCheCk leverages large language models (LLMs) to enhance the coherence checking of multi-level process documentation within business process management (BPM). The research addresses the need for automated approaches to detecting and managing incoherencies in process documentation. The artifact was developed following a design science research approach with iterative development and refinement, drawing on expert interviews with researchers and practitioners, iterative experimental benchmarking, and focus group validation based on demonstrations of a prototypical implementation using naturalistic data from diverse industries. aProCheCk can dynamically analyze and assess changes in BPM documentation, detect incoherencies, and provide actionable insights for maintaining process coherence. The findings reveal significant potential for improving operational efficiency, reducing manual effort, and detecting negative and positive process variation early to support continuous process innovation. This research contributes to the field of BPM by integrating LLMs into the BPM lifecycle, advancing generative AI-based applications in BPM practice, introducing the Business Process Change Classification Framework, and providing an open-source dataset that can serve as a foundation for future research and development.

Keywords
Large language model; Process coherence; Artificial intelligence; Process deviance; Business process management



Publication type
Research article (journal)

Peer-reviewed
Yes

Publication status
Published

Year
2025

Journal
Process Science

Volume
2

Issue
22

ISSN
2948-2178

DOI

Full text