🤖 AI Summary
Problem: Significant heterogeneity in granularity and structure between IoT sensor data and business process event data impedes their effective integration in process mining, and existing integration models are fragmented, hindering data sharing and cross-organizational collaboration.
Method: We propose OC-IoT-Log, a lightweight, object-centric core metamodel for IoT-enhanced event logs, designed from requirements common across domains to uniformly represent fine-grained sensor observations and traditional process events. Built via metamodeling and implemented as a Python prototype, it is empirically validated on manufacturing and healthcare use cases.
Contribution/Results: OC-IoT-Log substantially improves interoperability between IoT and process data, enables reproducible, cross-scenario process mining experiments, and provides a foundation for standardization and collaborative research in IoT-aware process analytics.
📝 Abstract
Advances in Internet-of-Things (IoT) technologies have prompted the integration of IoT devices with business processes (BPs) in many organizations across various sectors, such as manufacturing, healthcare, and smart spaces. The proliferation of IoT devices leads to the generation of large amounts of IoT data providing a window into the physical context of BPs, which facilitates the discovery of new insights about BPs using process mining (PM) techniques. However, to achieve these benefits, IoT data need to be combined with traditional process (event) data, which is challenging due to the very different characteristics of IoT and process data, for instance their granularity levels. Recently, several data models have been proposed to integrate IoT data with process data, each focusing on different aspects of data integration based on different assumptions and requirements. This fragmentation hampers data exchange and collaboration in the field of PM, e.g., making it tedious for researchers to share data. In this paper, we present a core model synthesizing the most important features of existing data models. As the core model is based on common requirements, it greatly facilitates data sharing and collaboration in the field. A prototypical Python implementation is used to evaluate the model against various use cases and demonstrate that it satisfies these common requirements.
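Since the abstract describes the model only at a high level, a minimal illustration may help convey the general idea. The sketch below shows one possible way to represent a coarse-grained process event that references business objects (the object-centric aspect) and carries fine-grained IoT sensor observations. All class and field names here are illustrative assumptions; they do not reproduce the paper's actual metamodel.

```python
# Hypothetical sketch of an object-centric, IoT-enhanced event log entry.
# Class and attribute names are assumptions for illustration only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BusinessObject:
    """A process object (e.g., an order, a workpiece, a patient)."""
    object_id: str
    object_type: str

@dataclass
class SensorObservation:
    """A fine-grained IoT measurement, e.g., one temperature reading."""
    sensor_id: str
    timestamp: datetime
    value: float
    unit: str

@dataclass
class ProcessEvent:
    """A coarse-grained process event that references multiple objects
    and the sensor observations recorded while it was executed."""
    activity: str
    timestamp: datetime
    object_ids: list[str] = field(default_factory=list)
    observations: list[SensorObservation] = field(default_factory=list)

# Example: a manufacturing "Drill" event on one workpiece, enriched
# with two temperature readings from the machine's sensor.
piece = BusinessObject("wp-17", "workpiece")
event = ProcessEvent(
    activity="Drill",
    timestamp=datetime(2024, 5, 2, 9, 30),
    object_ids=[piece.object_id],
    observations=[
        SensorObservation("temp-1", datetime(2024, 5, 2, 9, 30, 5), 71.2, "C"),
        SensorObservation("temp-1", datetime(2024, 5, 2, 9, 30, 20), 74.8, "C"),
    ],
)
max_temp = max(o.value for o in event.observations)
```

This kind of two-level structure, where process-level events aggregate sensor-level observations, is one way to bridge the granularity gap the abstract mentions; the paper's core model should be consulted for the actual classes and relations.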