Cost-efficient authoring of large interactive industrial objects and dynamic environments for XR consumption (Research)
Numerous industrial application cases exist for Augmented Reality (AR) and Virtual Reality (VR) - jointly denoted as eXtended Reality (XR). All of these application cases have in common that they require (graphical) XR representations of industrial products and even entire production environments. This requirement is typically tackled via manual modelling (e.g., with CAD software or a game engine), which is an expensive solution due to the large amount of human labor involved. Moreover, even when technical drawings such as CAD models exist, they often fail to yield an optimal graphical appearance in XR (e.g., unrealistic lighting). We argue that these XR content authoring costs raise an adoption barrier, deterring manufacturing companies as well as industrial technology/service providers from fully exploiting XR technology. XRtwin_SBO will therefore research scanning-based solutions that enable cost-efficient XR content acquisition directly from the real world (i.e., with minimal manual modelling or processing effort). The focus will be on capturing large industrial objects or spaces that are interactive and dynamic (the current state of the art is chiefly concerned with small and/or static objects). The resulting XR representations will be scalable in terms of not only graphical quality but also physics simulation accuracy, to warrant consumption on heterogeneous client devices with varying capabilities ("capture once, consume on any device"). An XR-native reference system architecture (with standardized interfaces to external systems such as ERP) and a common XR data format will be researched to maximize valorization potential by promoting interoperability.
Project results will be validated in InfraFlex_INFRA and SmartFactory, as well as in real-world industrial applications defined in conjunction with user group members (e.g., authoring an XR model of a high-mix, low-volume (HMLV) shopfloor that is synchronized with the physical setup to enable virtual monitoring, planning, and validation of layout changes).
Project period
01 September 2022 - 31 August 2026