Digital twins are having a moment.
“Everyone’s talking about digital twin now,” James said. “It’s almost like an echo chamber. And if you look at the Gartner hype cycle, digital twin is right at the peak.”
And the reason everyone’s talking about digital twin? Because it lies at the convergence of economics and technology, James believes. Plus, the computer software, hardware, and means to capture and transmit data—all elements required for digital twin models—are becoming more available and affordable. At the same time, companies face ever-present competition to be more efficient economically.
These forces are now converging, making digital twin methodology viable. Engineering simulation methodologies have been used in product development for decades to speed time to market, optimize designs, and reduce costs—and they are now being embraced more broadly to optimize operations as well.
So, what’s a digital twin anyway? Well…
“No one has the same definition of a digital twin,” James said. But briefly, he explained that a digital twin allows engineers to communicate with sensors on a product to gather information, creating a digital “twin” that can be monitored and assessed in real time. A key advantage of a digital twin is the ability to run continuous monitoring and diagnostics, enabling benefits like predictive maintenance. Digital twin can also help companies proactively plan manufacturing processes and factory operations, minimize unplanned downtime, and even predict equipment failure. Knowing when equipment needs to be replaced or maintained allows for more carefully planned operations.
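The predictive-maintenance idea above can be sketched in a few lines: a twin's model predicts what a healthy sensor reading should look like, and live readings that drift beyond tolerance trigger a maintenance flag before failure. Everything here (the bearing-temperature model, the coefficients, the tolerance) is a made-up stand-in for illustration, not a real deployment.

```python
# Minimal digital-twin monitoring loop (hypothetical model and thresholds;
# a real system would stream readings from physical sensors).

def expected_bearing_temp(rpm: float) -> float:
    # Stand-in for the twin's physics model: predicted temperature (deg C)
    # for a healthy bearing at a given shaft speed.
    return 25.0 + 0.01 * rpm

def check_reading(rpm: float, measured_temp: float, tolerance: float = 5.0) -> str:
    """Compare a live sensor reading against the twin's prediction."""
    residual = measured_temp - expected_bearing_temp(rpm)
    if residual > tolerance:
        return "schedule maintenance"  # drift suggests wear before failure
    return "ok"

# Three successive readings at the same shaft speed:
readings = [(1500.0, 41.2), (1500.0, 40.5), (1500.0, 47.9)]
statuses = [check_reading(rpm, temp) for rpm, temp in readings]
```

The value of the twin is in the residual: rather than a fixed alarm threshold, the reading is judged against what the model says it should be under current operating conditions.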
In James’s view, digital twin methodology is more useful for meeting economic goals than it is for specific technology end goals, especially for complex projects and systems. “Remote modeling of sensors has been around forever and is really old news. A digital twin gives you far more complex reliability insight within complex systems, like you need with a wind turbine with lots of components and multiple physics driving reliability and performance.”
James explained that a digital twin has several core components: sensors, data analytics and controls, a communication platform (such as a cloud), and decision-making physics. A digital twin can compile massive amounts of sensor data across industries, covering quantities such as fluid flow and motion/velocity/displacement and supporting techniques such as pattern recognition.
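One way to picture how those components fit together is as a composition of interchangeable layers: a sensor source, a physics-based decision model, a publishing channel, and a retained history for analytics. The class and field names below are assumptions made for this sketch, not any vendor's API.

```python
# Illustrative wiring of the core components James lists: sensors,
# analytics, a communication platform, and decision-making physics.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Any

@dataclass
class DigitalTwin:
    read_sensors: Callable[[], Dict[str, float]]       # sensor layer
    physics_model: Callable[[Dict[str, float]], Dict]  # decision-making physics
    publish: Callable[[Dict], Any]                     # communication platform (e.g. cloud)
    history: List = field(default_factory=list)        # data retained for analytics

    def step(self) -> Dict:
        raw = self.read_sensors()
        decision = self.physics_model(raw)
        self.history.append((raw, decision))  # kept for later pattern analysis
        self.publish(decision)
        return decision

# Toy stand-ins for each layer:
published = []
twin = DigitalTwin(
    read_sensors=lambda: {"flow_lpm": 12.0},
    physics_model=lambda s: {"valve_open": s["flow_lpm"] < 15.0},
    publish=published.append,
)
result = twin.step()
```

Keeping each layer behind a plain callable is one way to let the same twin skeleton serve different sensors, physics models, and transport platforms.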
A key building block in digital twin methodology is reduced-order modeling (ROM), which distills large-scale, full-fidelity models down to their essential inputs and outputs. “Essentially, ROM boils down the essence of a complex system into something simple—something that is typically a neutral format,” James explained.
For example, a heat exchanger unit may be one component in a rather complex system. While the actual workings of the heat exchanger may be complex in and of themselves, the only parameters provided by the ROM are the input and output temperatures. No other engineering data, especially proprietary material properties and CAD geometry data, is provided. A systems-level model can integrate and utilize thousands of these ROMs at a time.
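The heat-exchanger example can be sketched as a ROM that exposes only a temperature-in/temperature-out mapping, hiding all geometry and material data. The effectiveness and coolant values below are invented lumped parameters; in practice they would be fitted from full-fidelity simulation runs or test data.

```python
# Sketch of a reduced-order model (ROM) for the heat-exchanger example:
# the full thermal model collapses into a single input/output mapping.

def heat_exchanger_rom(t_in: float) -> float:
    """Only inputs and outputs are exposed; no proprietary internals."""
    effectiveness = 0.7  # assumed lumped parameter (fitted in practice)
    t_coolant = 20.0     # assumed coolant temperature (deg C)
    return t_in - effectiveness * (t_in - t_coolant)

# A systems-level model can chain many such ROMs without ever
# seeing their internals:
def cooling_train(t_in: float, stages: int) -> float:
    t = t_in
    for _ in range(stages):
        t = heat_exchanger_rom(t)
    return t

t_out = cooling_train(100.0, stages=2)
```

Because each ROM is a black box with a neutral interface, a systems-level model can integrate thousands of them, exactly as the article describes, without exposing any supplier's proprietary engineering data.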
Ultimately, James believes that the standard set of engineering and design product development deliverables will come to include a ROM file. Some equipment manufacturers already provide CAD files so that their equipment, or at least its outer dimensions, can be placed in a customer’s CAD drawing, such as a plant layout. The next evolution in the process is providing a ROM file.
James believes companies should hit the proverbial ground running and consider digital twin methodologies now, not later. “Digital twins are 2-5 years away from hitting their stride for true productivity,” James explained, “but it takes a while for technologies to work their way into companies so that they’re productive and valuable.”