Enabling Legacy Lab-Scale Production Systems: A Digital Twin Approach at Széchenyi István University
The burgeoning importance of digitalization and cyber-physical manufacturing systems in the industrial sector is undeniable, yet discussions around viable solutions for small and medium-sized enterprises remain scant. These enterprises often face constraints in replacing extant machinery or implementing extensive IT upgrades, despite the availability of skilled engineering personnel. In response to this gap, an illustrative use case involving the application of Digital Twins (DT) to legacy systems is delineated, encompassing a detailed exploration of necessary hardware and software components, alongside pertinent considerations for implementation design. The establishment of a symbiotic relationship between the physical and digital realms is underscored as imperative, necessitating a granular understanding of the system to uncover opportunities and constraints for intervention. Such understanding is posited as a critical determinant of the DT's utility. This case study, situated within the Cyber-Physical Manufacturing Systems Laboratory at Széchenyi István University, serves to elucidate these principles and contribute to the discourse on smart manufacturing solutions for legacy systems.
Various interpretations of the Industry 4.0 initiative have proliferated, encompassing diverse concepts. Nonetheless, a consensus has emerged that digitalization is a fundamental cornerstone of the initiative. The advent of digitalization and cyber-physical systems has elevated the significance of Digital Twins, as evidenced by numerous scholarly articles and publications examining their potential in manufacturing and intralogistics. National and international strategic roadmaps have underscored this prominence, incorporating industrial and scientific research directions.
A DT is conceptualized as an IT system that generates models harboring identical information across both the real and virtual domains. Conventional methodologies do not entail testing the designed control logic on the actual production system prior to implementation; such testing is conventionally relegated to the completed and assembled production cells and lines. In contrast, deploying a pre-constructed simulation model permits parallel testing of the control logic, extendable back to the conceptual design phase. Virtual commissioning, facilitated by this approach, serves to diminish the duration of physical commissioning and to reduce the engineering labor required in the field.
Critically, an accurate simulation model, or DT, is not limited to initial use; it remains adaptable and applicable throughout the lifespan of the actual system, providing a platform for evaluating future modifications. In doing so, it enhances productivity across the engineering chain and supports the operation of manufacturing systems through system and service applications. Despite the discourse surrounding DTs in the context of new systems, discussions of their application to existing legacy systems remain sparse. This text delineates the process of constructing a DT for a legacy system, with particular emphasis on retrofitting a device to industry standards, encompassing multiple systems.
2. Literature Review
The initial terminological articulation of the subject domain was rendered by Grieves in 2005, who coined the term Mirrored Space Model (MSM); NASA followed in 2010, introducing the term DT and providing its inaugural definition. The trajectory of scientific development has transcended the confines of the aerospace sector, aspiring to cultivate an intelligent manufacturing environment that now encompasses an array of new technologies, applications, and machine-learning methodologies. Within the corpus of literature, the functions of the DT in the context of the cyber-physical system, along with its multifarious applications, have been distilled into three primary domains: monitoring, life-cycle analysis, and decision-making.
In preceding applications, a clear propensity for supporting the operational facets of the physical system was exhibited. In the 2021 compendium Digital Twin Concept, Technology, Industry Application Summary by Liu et al., a comprehensive endorsement of these application modalities is provided, with an analytical distribution of their prevalence in extant literature. The categorization formulated by the authors encompasses concept, technology, paradigm and framework, and application, with the latter category eclipsing the cumulative total of the preceding three in terms of volume. Furthermore, the classification of DT solutions according to their lifecycle stages - Design, Manufacturing, Service, Retire, and Full Lifecycle - reveals a skewed focus in the literature. A preponderance of solutions concentrate on singular lifecycle phases, with a mere 5% contemplating the entire spectrum of the lifecycle. The production and service phases are particularly prominent in this context.
In 2018, a classification of the digital twin into three distinct subcategories was proposed by Kritzinger et al., grounded explicitly in the nature of the communicative link between the physical entity and its digital counterpart. These categories are identified as the digital model, the digital shadow, and the DT. The digital model is defined as a digital representation of either an extant or a prospective physical object; crucially, its definition does not necessitate automatic data exchange between the physical and digital realms. Conversely, the digital shadow is conceptualized as a virtual depiction of the system, operating on data derived from the physical system. However, its interaction is limited to observation, lacking the capability to influence the physical processes, and thus constitutes a unidirectional form of communication. This category is often erroneously conflated with the DT. The DT, in contrast, is characterized by a bidirectional flow of data, enabling not only the monitoring but also the manipulation of the physical system's operations. Such interventions are contingent upon processes within the virtual domain or may be predicated on historical data. Nonetheless, these interventions must be meticulously strategized, ensuring alignment with safety protocols to safeguard both the personnel involved and the operational integrity of the system.
Simulations associated with the DT are employed to prognosticate the potential performance of its physical counterpart in real-world scenarios. This stands in contrast to the design process, which traditionally relies upon idealized or perceived worst-case scenarios. By juxtaposing actual system performance data with that of the DT, informed decisions can be made, contributing to successful outcomes. Furthermore, the integration of data from the physical twin into the digital twin facilitates the refinement of system models, subsequently enabling the utilization of DT analyses to enhance the real-world performance of the physical system.
The landscape of research in the realm of DT technology has been enriched by numerous scholars, each contributing insights into its potential and diverse methodologies. In 2018, Padovano et al. elucidated a dynamic system providing operators with "virtual eyes and hands" within a physical environment. This system is characterized by the generation and consumption of services to facilitate interoperability, incorporating heterogeneous and remotely accessible web services, in conjunction with middleware facilitating interactions with extant systems, including SCADA (Supervisory Control and Data Acquisition) and ERP (Enterprise Resource Planning).
The integration of a DT model of manufacturing systems into a decision support system, with particular emphasis on its utility in order management processes, was expounded upon by Kunath and Winkler in 2018. In a subsequent study conducted in 2020, Jeon and Schuesslbauer delineated the implementation of an architecture encapsulating four critical phases of production: design, operation, optimization, and validation and implementation. This process entails creating a model to simulate the physical system's behavior, and subsequently establishing bidirectional communication between the real world (PLC) and the virtual model.
A semi-automated methodology for DT creation of industrial processes was proposed by Sierla et al., involving the extraction of information from diagrams, its conversion into a graphical format for simulation model creation, and the subsequent configuration and parameterization of the simulation model in accordance with process data to yield a DT. The optimization of a dynamic scheduler for smart manufacturing through a DT of a manufacturing cell was explored by Xia et al., wherein a smart scheduling agent, termed a digital engine, was developed and honed using deep reinforcement learning (DRL) algorithms.
A reference model predicated on a surface model shape was proffered by Schleich et al. in 2017 as a physical product twin for design and manufacturing applications. This model was comprehensively defined, spanning concepts, representation, implementation, and applications throughout the product lifecycle. A comparative analysis of the Industrial Internet Consortium (IIC) digital twin standard and prevalent DT creation methodologies was undertaken by Konstantinov et al., culminating in an evaluation of extant software solutions for DT creation, based on requisite characteristics for their instantiation.
3. Bridging Digital and Physical Dimensions
At the Cyber-Physical Manufacturing Systems Laboratory within Széchenyi István University, a case study was conducted, employing an automated training manufacturing cell in conjunction with Visual Components (VC) software, as depicted in Figure 1. The undertaking was oriented towards the alignment of automated physical processes with DTs, thereby facilitating interventions in physical operations while simultaneously enabling real-time data manipulation.
The proposed methodological approach is grounded in utilizing readily accessible information as the foundational basis for DT design, particularly pertinent for small and medium-sized enterprises operating within the EU economic territory. The starting point is the information required for machinery and partly completed machinery subject to the Machinery Directive, ensuring compliance for market placement and operational feasibility. Essential information pertaining to the physical production system encompasses:
• Computer-Aided Design (CAD) drawings;
• Electrical, pneumatic, and hydraulic drawings (omission of non-applicable systems is permitted);
• PLC program operating principle.
In the initial phase of DT development, the imperative task is the generation of a 3D model of the system. This is achieved by importing existing models into VC, as illustrated in Figure 2.
Following the importation of CAD schematics, the model must be parameterized to align with the operational logic inherent to the physical system for each component, whether moving or sensing. This necessitates an in-depth comprehension of the automatic production cell's functionality. The legacy system model was subsequently developed, incorporating its interactions with, and responses to, signals from the PLC. It is acknowledged that the PLC foundational to this use case lacks compatibility with the contemporary, industry-standard communication protocols requisite for DT design.

Upgrading legacy systems is recognized as a financially intensive endeavor: the associated costs extend beyond the mere acquisition of new PLC controllers, encompassing potential productivity losses within an operational system and the substantial engineering hours required for migration between controllers. As a cost-effective alternative, the integration of a new PLC unit supporting OPC-UA is proposed, facilitating data management across multiple manufacturing systems under disparate control. This unit functions as a quasi-hub interfacing with the various PLCs. Consequently, data pertinent to the different PLCs are centralized in the “data collector” PLC, ensuring seamless communication with the model established in Visual Components.

Data exchange between the production cell and the digital twin transpires through the OPC-UA protocol, with the PLC and VC serving as the OPC-UA server and client, respectively. During these interactions, the server avails all sensor data and pre-defined system description parameters essential for real-time operation. It is pertinent to note that when the virtual system's operation is contingent solely upon server data, the term “digital shadow” applies, given the absence of bidirectional data transmission, as illustrated in Figure 2.
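The hub role described above can be sketched in a few lines. The following is an illustrative model under assumed names, not the authors' implementation; in a real deployment the aggregated namespace would be exposed by the OPC-UA server running on the collector PLC rather than by a Python object.

```python
# Hypothetical sketch of the "data collector" role: tag values from several
# legacy PLCs are merged into one prefixed, flat namespace that a single
# OPC-UA server could then publish to the Visual Components client.
# All class, controller, and tag names here are illustrative assumptions.

class DataCollector:
    """Aggregates tag snapshots from several source controllers under
    prefixed names, so one server can expose them all consistently."""

    def __init__(self):
        self._tags = {}  # flat namespace, e.g. "plc1.conv_run" -> value

    def update(self, plc_id, tags):
        """Merge one controller's current tag snapshot into the hub."""
        for name, value in tags.items():
            self._tags[f"{plc_id}.{name}"] = value

    def snapshot(self):
        """What the OPC-UA server would publish at the next interval."""
        return dict(self._tags)

hub = DataCollector()
hub.update("plc1", {"conv_run": True, "part_count": 42})  # bool + int tags
hub.update("plc2", {"gripper_closed": False})
print(hub.snapshot()["plc1.part_count"])  # -> 42
```

Prefixing tag names with a controller identifier keeps the merged namespace collision-free even when two legacy PLCs use identical local tag names.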
In pursuit of a cost-effective solution, as previously delineated, the proposed system architecture underwent rigorous evaluation with a focus on data transmission efficacy. An entry-level product from a prominent PLC manufacturer, inherently equipped with native OPC-UA support, was selected for these trials. During the testing phase, manufacturing systems were consolidated under a singular controller to assess the resultant workload. Connections between the controller and the PC executing the VC software were established via Ethernet. Two prevalent data types were employed: the boolean (1-bit) type, commonly associated with digital I/Os, and the integer (16-bit) type.
Experiments were meticulously conducted, not with the intention of quantifying the transferable data volume, but to scrutinize the impact of data publication to clients, such as the VC model, on DT functionality. Figure 3 presents an analysis of client-side data loss observed when 500 signals were concurrently altered across varying cycle times. For each measurement setting, the test program was executed ten times, with the resultant averages subsequently plotted.
Findings from these experiments indicate that even with an entry-level PLC, it is feasible to aggregate and transmit data from numerous production cells and lines to the DT, thereby ensuring synchronization between the physical and digital entities. Attention must, however, be directed towards the careful selection of publication intervals, particularly in scenarios involving simultaneous data alterations. Under the tested PLC configuration, with 500 simultaneous data changes the publication interval from the PLC to the PC could be shortened to 700 ms without incurring data loss.
For the rationalization of bidirectional data exchange, it becomes imperative to delineate the intervention points through which the digital entity could potentially influence the physical system. In the establishment of such intervention points, paramount importance is attributed to adhering to health and safety regulations whilst ensuring the sustained operational integrity of the system.
In instances where an external intervention in an automated system, governed by a PLC, is envisaged, the PLC’s response to such interventions necessitates careful consideration. Thus, a critical task that emerges is the synchronization of control signals and the preparation of the system for the assimilation of commands emanating from the DT.
A particular case in point, as illustrated in Figure 4, is the “pick & place” workstation within the automated system. The initiation of movement at this juncture is contingent upon the concurrent presence of both the product and the component designated for installation. This prerequisite set of conditions finds its complement in a digital process. The PLC, in this context, is programmed to initiate the “pick & place” action solely when all stipulated conditions are met, inclusive of those resultant from digital processes. These digital processes might manifest as either simple or complex timers, with the latter scenario even accommodating the integration of a virtual workstation. This inclusion not only augments but also streamlines the developmental trajectory of production cells and the subsequent integration of workstations into the overarching process.
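The interlock described above reduces to a conjunction of physical and digital conditions. The sketch below uses assumed signal names; in the real cell this logic lives in the PLC program, with the digital release arriving over OPC-UA.

```python
# Minimal sketch (assumed names) of the pick & place interlock: the PLC
# starts the motion only when the physical prerequisites AND the release
# condition supplied by the digital process are all satisfied.

def pick_and_place_enabled(product_present, component_present, digital_release):
    """All physical sensor conditions and the DT-side release must agree."""
    return product_present and component_present and digital_release

# Physical conditions met, but the digital process has not released yet:
print(pick_and_place_enabled(True, True, False))  # False -> motion blocked
# Everything satisfied:
print(pick_and_place_enabled(True, True, True))   # True  -> motion may start
```

Because the digital release is just one more input to the PLC's condition set, a simple timer, a complex schedule, or an entire virtual workstation can supply it without changing the physical interlock.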
In the course of the conducted experiments, several pertinent observations were made regarding the establishment of a DT in direct interaction with the PLC governing the production system; these provide valuable insights for enhancing application efficiency, including for industrial collaborators.
It was noted that each sensor and actuator exhibits rapid sensing and response characteristics. Multiple state changes of an input or output within one publication interval of the OPC-UA server hosted on the PLC pose a substantial risk of data loss or information discrepancy on the digital side. Instances were identified, such as short pulse signals, wherein a solitary trigger signal from the PLC to the actuator results in an immediate reset of the actuator state to its original position. Consequently, the manufacturing unit remains unresponsive on the digital side during these brief milliseconds, owing to the non-reception of this data.
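One conceivable mitigation, not taken from the paper, is to latch short pulses on the server side so that a pulse briefer than the publication interval is still visible to the client for at least one tick. The sketch below illustrates the idea under assumed names.

```python
# Hypothetical pulse-stretching latch: a trigger that rises and falls
# within one publication interval is held until it has been published
# once, so the digital side cannot miss it entirely.

class PulseLatch:
    def __init__(self):
        self._latched = False

    def write(self, value):
        if value:
            self._latched = True  # remember the rising edge

    def publish(self):
        """Value the OPC-UA server would expose at this publication tick."""
        value = self._latched
        self._latched = False     # consume the pulse once it was published
        return value

latch = PulseLatch()
latch.write(True); latch.write(False)  # pulse shorter than one interval
print(latch.publish())  # True  -> the pulse is still seen once
print(latch.publish())  # False -> cleared after publication
```

The trade-off is that the digital side observes the pulse later and longer than it physically occurred, which must be accounted for in the model's timing logic.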
A further implication of this software and communication configuration is the observed discrepancy in the sequence of various signal changes. Given the high frequency of diverse signal changes within a single publication interval, and the batched, packet-by-packet data transmission, the chronological sequence of these changes on the digital side may not consistently mirror their occurrence in the physical system. This phenomenon raises concerns regarding the reliability of the model's functionality, even when it is parameterized and programmed using the PLC control logic within the digital space.
In light of these challenges, the design of a resilient digital twin, informed by an understanding of these potential disruptions, becomes imperative. The integration of synchronization points and algorithms into the model constitutes a critical step in this mitigation strategy. Influencing factors in the physical system, such as the friction coefficient between workpieces and the conveyor belt, or the precise parameters of acceleration and deceleration of the belts, are often overlooked in the digital counterpart. This discrepancy can lead to deviations between the DT and the actual system.
To ameliorate these inconsistencies, the implementation of synchronization points in the physical system, at junctures where the position of the workpiece is definitively known (e.g., a presence sensor signal), is proposed. This approach ensures alignment between the systems; if the workpiece precedes its digital counterpart to a synchronization point in the physical system, its position is correspondingly adjusted in the digital space. Conversely, if the workpiece reaches the synchronization point in the digital space ahead of its physical counterpart, it is programmed to await physical detection. This enhancement in synchronization, particularly when applied to remote monitoring and visualization, holds significant potential for augmenting the efficiency of the DT.
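The synchronization rule described above can be stated compactly. The following sketch uses assumed names and millimeter positions; in the laboratory setup the known-position event would be a presence sensor signal arriving over OPC-UA.

```python
# Hedged sketch of the synchronization-point rule: at a point where the
# physical workpiece position is definitively known (e.g. a presence
# sensor), whichever twin arrived first is corrected or made to wait.

def synchronize(physical_at_point, digital_position_mm, point_position_mm):
    """Return the digital workpiece position and whether it must wait.

    - physical piece arrives first -> snap the digital piece forward
    - digital piece arrives first  -> hold it until the sensor fires
    """
    if physical_at_point and digital_position_mm < point_position_mm:
        return point_position_mm, False  # jump digital piece to the sensor
    if not physical_at_point and digital_position_mm >= point_position_mm:
        return point_position_mm, True   # wait at the sensor for the real piece
    return digital_position_mm, False    # twins agree; continue normally

print(synchronize(True, 180.0, 200.0))   # (200.0, False) digital catches up
print(synchronize(False, 205.0, 200.0))  # (200.0, True)  digital waits
```

Applying this correction at every sensor keeps accumulated errors from unmodeled effects, such as belt friction or acceleration ramps, bounded by the distance between consecutive synchronization points.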
In the presented case study, the implementation of DT, characterized by bi-directional communication via a near real-time standard protocol within a software environment, has been elucidated. This digital twin possesses the capability to enact real-world interventions in the physical system based on decisions formulated at the digital level. The establishment of two-way communication engenders a genuine digital twin, wherein the virtual system transcends monitoring and mapping of processes, extending its functionality to actively influencing the operation of the physical system.
It is imperative to emphasize that the operationalization of a DT necessitates a holistic vision and profound understanding of the entire system, encompassing both the virtual processes and the requisite preparation of the physical system for such intricate interactions. The investigation has explored an alternative communication methodology to circumvent the limitations inherent to legacy PLCs governing antiquated systems. It has been demonstrated through measurements that data aggregation from multiple automation cells or production lines is feasible via a modern PLC, subsequently facilitating data transmission to the DT software.
A critical consideration highlighted pertains to the potential pitfalls associated with an OPC-UA server operating locally on a PLC. The issue of batched data transmission has been identified, resulting in potential discrepancies in signal order between the real and virtual systems. This necessitates careful attention during the design phase, particularly when interventions in the real system are anticipated. Additionally, the volume of simultaneously changing data has been recognized as a factor influencing the real-time correlation between the virtual and physical systems.
Conceptualization, G.D.M., N.S., R.K., S.K.S. and S.F.; methodology, G.D.M., N.S., R.K., S.K.S. and S.F.; software, G.D.M., N.S., R.K., S.K.S. and S.F.; validation, G.D.M., N.S., R.K., S.K.S. and S.F.; formal analysis, G.D.M., N.S., R.K., S.K.S. and S.F.; investigation, G.D.M., N.S., R.K., S.K.S. and S.F.; resources, G.D.M., N.S., R.K., S.K.S. and S.F.; data curation, G.D.M., N.S., R.K., S.K.S. and S.F.; writing—original draft preparation, G.D.M., N.S., R.K., S.K.S. and S.F.; writing—review and editing, G.D.M., N.S., R.K., S.K.S. and S.F.; visualization, G.D.M., N.S., R.K., S.K.S. and S.F.; supervision, S.K.S. and S.F.; project administration, G.D.M., N.S., R.K., S.K.S. and S.F.; funding acquisition, G.D.M., N.S., R.K., S.K.S. and S.F. All authors have read and agreed to the published version of the manuscript.
The data used to support the research findings are available from the corresponding author upon request.
This work was technically supported by the Digital Development Center of Széchenyi István University and the research team “SZE-RAIL”.
The authors declare no conflict of interest.