We inside the technology industry do love our terminology. Think about it – does the average person outside of the tech industry really know what we mean by human augmentation, mesh networks, blockchain, intelligent assistants and the like? No wonder there are entire dictionaries devoted to tech.i Our own “How to Speak Like a Data Center Geek” series, which aims to clarify the terms that data center geeks are prone to toss around, has highlighted nearly 20 technologies that are defining digital business. Since the internet of things (IoT) is a major catalyst for many businesses’ digital transformation, we thought it would be a good time to explore a term it’s often paired with – digital twins.
Digital Twins – What they are
It’s easy to guess what a digital twin might be – literally a digital version, or “twin,” of a real thing, most often resident on a cloud platform. But it’s more than just a blueprint, scanned image or 3D model of a thing. A digital twin is a virtual representation of a physical thing (a product, process, system, person, place, etc.) that captures all the digitized elements and dynamics of how that thing works and operates, plus how it evolves over time. Digital twins bridge the physical and virtual worlds, transmitting data between the two so that the physical object can be monitored and tested without anyone needing to be close to it. According to Deloitte, “Digital twins are accelerating product and process development, optimizing performance, and enabling predictive maintenance.”ii
Deloitte sees the global market for digital twins growing 38% annually to reach $16 billion by 2023. And the technologies that enable digital twins, such as IoT and machine learning, are each expected to almost double by 2020.ii
Types of digital twins
There are various ways to categorize digital twins – by use case, hierarchy or benefit – but the framework below provides a good general classification that works for most applications.ii,iii
- Product digital twin (design): Realistic digital models can help product designers prototype new ideas quickly, collaborate with user groups and test a variety of what-if scenarios to validate how the product will perform in the physical world.
- Production/operational digital twin (process): A virtual model of processes enables organizations to validate how well a process or process change will work before it’s deployed and, once deployed, identify inefficiencies and ways to address them. Applications outside of production factories include supply chain, healthcare, smart cities and more.
- Performance digital twin (predictive): Digital twins paired to a person, place or thing capture real-time data on usage, performance, environmental and operating conditions and more. Combining this data with artificial intelligence (AI) and machine learning (ML) algorithms enables organizations to predict when maintenance is required, gain efficiencies, improve digital twin models, solve problems faster and make more informed decisions.
- Remote digital twin (control): It’s debatable whether this is a separate category since digital twins, by definition, do not require proximity. But we’ll include it with the differentiating factor being human actor involvement. Examples include a doctor performing surgery or an engineer making repairs from afar via a digital twin model.
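The four-category framework above can be captured in a few lines of code. The sketch below is purely illustrative – the category names come from the classification above, while the example descriptions are hypothetical stand-ins, not products of any real deployment.

```python
from enum import Enum

class TwinType(Enum):
    """The four digital twin categories, keyed by their primary benefit."""
    PRODUCT = "design"
    PRODUCTION = "process"
    PERFORMANCE = "predictive"
    REMOTE = "control"

# Hypothetical examples mapped to each category, for illustration only.
examples = {
    TwinType.PRODUCT: "virtual prototype of a new turbine blade",
    TwinType.PRODUCTION: "simulated assembly line for a smart factory",
    TwinType.PERFORMANCE: "jet engine streaming in-flight telemetry",
    TwinType.REMOTE: "surgical robot operated via its twin from afar",
}
print(examples[TwinType.PERFORMANCE])
```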
Digital Twins – How they work
A digital twin typically begins its life as a physics or mathematical model of an asset. Once the initial digital twin is generated, it leverages data from sensors installed on the asset to monitor real-time performance, operating conditions and changes over time. This data is sent to a cloud-based system, where it is analyzed. AI and ML algorithms enable the digital twin to learn and keep track of changes over time – essentially becoming a living digital model of the asset. It may also leverage data from a variety of other sources: human experts, industry and domain knowledge, similar assets and information from a larger ecosystem. An example will help to illustrate:
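The lifecycle just described – an initial model, a stream of sensor readings, and a learning step that keeps the twin current – can be sketched in a few lines. This is a minimal, hypothetical illustration (the `EngineTwin` class, its fields and its thresholds are invented for this example), with a simple moving average standing in for the far richer AI/ML algorithms a real twin would use.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    """One timestamped measurement from a sensor on the physical asset."""
    timestamp: float
    temperature_c: float

@dataclass
class EngineTwin:
    """Toy digital twin of an engine: starts from nominal model parameters,
    then updates its state from streamed sensor data."""
    nominal_temp_c: float = 90.0   # from the initial physics model
    smoothed_temp_c: float = 90.0  # the twin's evolving estimate
    readings: list = field(default_factory=list)

    def ingest(self, r: SensorReading) -> None:
        # An exponential moving average stands in for the learning step
        # that keeps the twin in sync with the real asset over time.
        alpha = 0.2
        self.smoothed_temp_c = alpha * r.temperature_c + (1 - alpha) * self.smoothed_temp_c
        self.readings.append(r)

    def needs_maintenance(self) -> bool:
        # Flag sustained drift from the nominal model (predictive maintenance).
        return self.smoothed_temp_c > self.nominal_temp_c * 1.15

twin = EngineTwin()
for temp in [91, 95, 104, 112, 118, 121]:
    twin.ingest(SensorReading(time.time(), temp))
print(twin.needs_maintenance())  # the sustained rise trips the flag: True
```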
A 3D digital twin of a patient’s heart is created from data from thousands of other heart patients, scans of her heart and sensors attached to her pacemaker. AI/ML algorithms in the digital twin personalize the model to the characteristics of her heart and body. Her surgery is a tricky one, so the digital twin will enable her surgeon to plan and practice her surgery beforehand. The additional level of precision also means he can use a less invasive procedure. During surgery, the digital twin provides real-time 3D insight to the surgeon as he performs the surgery from 3,000 miles away. The surgery is a success, and the digital twin is later used in a classroom of premed students, so they can practice diagnosis and surgery for heart disease patients.
Digital twins and interconnection
Digital twins can vary in complexity from a single component in a product to an entire city, but they all have one thing in common – they evolve over time as the physical asset they are tied to changes. Tracking those changes accurately requires proximate interconnection between the asset and people, clouds, applications, AI and ML systems, and analytics the digital twin depends on. Direct and secure interconnection, at the digital edge, where digital twin data is being created and consumed, will deliver deeper insights more reliably, efficiently and cost-effectively. Here’s why interconnection is so important:
- WHERE does the data come from? It will typically come from multiple domains and sources. You won’t be collecting it from a single asset over a single network in one location. Rather, you’ll need access to an ecosystem of multiple networks to collect data from different sources.
- HOW does the data get processed? In some cases, the amount of data that needs to be transmitted to the cloud from sensors and other data sources for analysis by the AI and ML algorithms will be massive. You will need to be able to interconnect all the counterparties involved in the value chain around your data. Typically that means access to multiple clouds to store and process the data, requiring direct and secure hybrid multicloud interconnection.
- WHEN does the data need to be collected? Is it time sensitive? Some data may be collected less frequently for long-cycle analytics (e.g., refining the model over time or informing the next prototype), some of it will need to be real-time (e.g., your engine is overheating and you need to stop soon), and some of it depends on an action (e.g., a button press that should trigger a response). Real-time data collection requires fast, low-latency interconnection to ensure that data is delivered quickly and reliably, before its usefulness expires.
- WHO is using the data or are there other systems that depend on it? This requires interconnection between the twin repository and the consumers of the data (users, systems or ecosystems of partners) that need to take action on the data.
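The WHERE and WHEN questions above can be made concrete with a small routing sketch: data arrives from multiple sources, and latency-sensitive readings take a fast edge path while long-cycle data goes to batch cloud analytics. Everything here is hypothetical – the source names and the `route` function are invented for illustration, not taken from any real interconnection platform.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Reading:
    source: str               # WHERE: which network/domain the data came from
    value: float
    latency_sensitive: bool   # WHEN: does it need a real-time path?

def route(readings: List[Reading]) -> Tuple[List[Reading], List[Reading]]:
    """Split incoming twin data between a low-latency edge path and a
    batch path toward cloud analytics."""
    realtime = [r for r in readings if r.latency_sensitive]
    batch = [r for r in readings if not r.latency_sensitive]
    return realtime, batch

stream = [
    Reading("engine-sensor-net", 121.0, True),   # overheating alert
    Reading("fleet-history-db", 96.5, False),    # long-cycle analytics
    Reading("operator-console", 1.0, True),      # button press / action
]
rt, batch = route(stream)
print(len(rt), len(batch))  # 2 real-time readings, 1 batch reading
```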
Answering these questions boils down to simple physics. Because there are so many variables, digital twins need to have a dynamic, global interconnection platform that supports real-time interactions between people, things, locations, clouds and data. This means placing IT infrastructures close to where digital twin data is being generated, stored and processed, at the digital edge.
Download Equinix’s IOA Playbook to get the full scoop on how leveraging Interconnection Oriented Architecture™ (IOA™) best practices makes this happen.
And don’t be shy about checking out every post in the “Speak Like a Data Center Geek” series – you never know what you might learn!