Artificial intelligence (AI) has been around since the 1950s, but it has recently grown into a real force in the digital transformation of businesses and our personal lives. From digital twins that create “living” digital simulation models of physical things (e.g., avatars for airplanes, cars, etc.) to voice recognition applications in smartphones and home assistants, machines are busy learning all there is to know about us and our world. This expanding integration of AI in B2B and B2C products and services will increase data usage exponentially.
According to PwC research, AI is projected to be the biggest commercial opportunity in today’s fast-changing global digital economy. PwC predicts the global gross domestic product will be 14% higher in 2030 as a result of AI – the equivalent of an additional $15.7 trillion.[i]
So what happened to give AI technology such a growth spurt? The answer lies in the confluence of three factors that are making AI very real:
1) Big data and real-time analytics.
2) Elastic access to scalable hardware in the cloud.
3) Innovation in the AI/ML algorithm space, driven by an open source model.
With so much data being generated by individuals (average internet user = 1.5 GB/day), organizations (average smart hospital = 3,000 GB/day) and things (average connected car = 4,000 GB/day), AI algorithms now have access to enough data to build accurate models. Furthermore, clouds have made it cost-effective to crunch through these large data sets and have put the latest AI innovation within easy reach by making open source AI libraries readily available. As a result, AI/ML algorithms and techniques (e.g., neural networks) that were once considered too slow or inaccurate have gained new life (e.g., as deep learning) and now outperform humans at tasks such as image and speech recognition.
Three distributed AI architecture trends
While first-generation AI architectures have historically been centralized (i.e., both model building and use are done in a centralized cloud), at Equinix we see three trends that will accelerate distributed enterprise AI architectures:
Trend #1: Centralized to Distributed AI Architectures
AI architectures will be distributed and enterprise AI model building and model inferencing will take place at local edge locations, physically closer to the multiple sources that are producing huge volumes of data. For performance, security and cost reasons, in many cases, it does not make sense to move massive data sets from the edge to a central location. Let’s look at a typical AI modeling use case to better illustrate these issues.
In the case of an airplane, the ability to detect faults prior to a catastrophic event is a high-stakes, potentially lifesaving priority. The following airline AI model for anomaly detection demonstrates why distributed AI architectures are so critical:
- There may be four to six terabytes of data per plane, per day that need to go to one centralized place for modeling. To build an accurate model, you have to move large datasets from multiple planes to a central location. Moving six terabytes takes a couple of hours even on the fastest widely available networks (at a sustained 10 Gbps, six terabytes is roughly 80 minutes of transfer before any protocol overhead).
- Now, instead of moving these massive datasets, suppose each plane builds a local (potentially sub-optimal) model at the edge and moves only that model (on the order of tens of gigabytes) to the central location, where a more accurate global model is built from the local models. This would be both more cost-effective and much faster.
- Once the final global model is built, it can be shipped back to each local location quickly and at low cost, so that every location has the same information and can leverage, in real time, the insights of a comprehensive model that integrates the modeling knowledge from all sources (a minimal sketch of this pattern follows this list).
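To make the economics concrete, here is a minimal sketch of this edge-to-core pattern, essentially federated averaging over a simple linear model, written in Python with NumPy. The `local_fit` and `aggregate` helpers, the least-squares model and the synthetic data are illustrative assumptions, not an airline's or Equinix's actual pipeline.

```python
# Minimal sketch of edge-to-core model aggregation (federated averaging).
# Each edge site holds its own (X, y) data; only model parameters move.
import numpy as np

def local_fit(X, y):
    """Fit a least-squares linear model locally at one edge site."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, len(y)  # parameters plus sample count for weighting

def aggregate(local_results):
    """Build the global model as a sample-weighted average of local models."""
    total = sum(n for _, n in local_results)
    return sum(coef * (n / total) for coef, n in local_results)

# Simulate three edge sites (e.g., three planes), each with its own data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])
edge_datasets = []
for _ in range(3):
    X = rng.normal(size=(1000, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=1000)
    edge_datasets.append((X, y))

# Only small parameter vectors (not the raw terabytes) travel to the core.
local_models = [local_fit(X, y) for X, y in edge_datasets]
global_model = aggregate(local_models)
print(global_model)  # close to [2.0, -1.0, 0.5]
```

In practice the local models would be far richer (e.g., deep neural networks) and the aggregation step more sophisticated, but the trade-off is the same: parameter sets measured in gigabytes move instead of datasets measured in terabytes.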
This can all happen by performing AI model-building tasks at edge locations close to the end devices, and by deploying distributed data fabric networks that span the edge and central locations and provide a single namespace for easy policy management.
Trend #2: AI Models Built Using Multiple External Data Sources
Gartner has estimated that a typical large-scale AI application uses around 10 external data sources on average to improve the accuracy of the model. Typically, companies procure data from external sources (e.g., weather, traffic or marketing data) through bilateral agreements, in an ad hoc, cumbersome manner. Furthermore, according to IDC, 90% of large enterprises will sell data as a service (raw data, derived metrics, insights and recommendations) by 2020, up from nearly 50% in 2017.[ii]
We are thus entering the era of AI/ML marketplaces, where data and algorithms are bought and sold between companies. However, many enterprises hesitate to monetize their data because they worry that buyers will use it in unauthorized ways, even when legal contracts are in place. A key requirement, therefore, is a cloud-neutral location where participants bring both the data and the algorithms that operate on it, packaged in Docker containers, and where the marketplace provides services to validate both the data and the algorithm content.
This marketplace platform should sit in a neutral “demilitarized zone” where raw data cannot be taken out; instead, secure APIs and validated AI algorithms let buyers leverage the data's intelligence without ever removing it from the marketplace. This allows businesses to monetize valuable data and algorithms without compromising their content in the process.
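As a rough illustration of that contract, here is a hypothetical Python sketch in which the dataset never leaves the neutral zone and a buyer's algorithm (a plain callable standing in for a validated Docker container) reaches the data only through an API that returns derived results. All names here (`run_algorithm`, `_PRIVATE_DATASETS`, the sample telemetry) are invented for illustration and are not part of any actual marketplace platform.

```python
# Hypothetical sketch of a "data never leaves" marketplace contract.
from typing import Any, Callable

# Raw data held inside the marketplace "demilitarized zone".
_PRIVATE_DATASETS = {
    "flight-telemetry": [{"engine_temp": 612}, {"engine_temp": 598}, {"engine_temp": 641}],
}

def run_algorithm(dataset_id: str, algorithm: Callable[[list], Any]) -> Any:
    """Run a validated algorithm against in-place data; return only its derived output."""
    data = _PRIVATE_DATASETS[dataset_id]  # never serialized back to the caller
    result = algorithm(data)
    if result is data:  # crude illustrative guard; a real platform relies on
        raise ValueError("raw data may not leave the marketplace")  # container isolation
    return result

# A buyer's "algorithm": an aggregate insight, not a data export.
mean_temp = run_algorithm(
    "flight-telemetry",
    lambda rows: sum(r["engine_temp"] for r in rows) / len(rows),
)
print(mean_temp)
```

The design point is that the marketplace exposes computation over the data, not the data itself, so providers keep control of their raw records while buyers still extract value.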
Trend #3: AI Architectures Will Leverage Public Cloud Innovation
The public cloud is demonstrating a much faster rate of innovation than any other IT infrastructure to date, with easy access to seemingly infinite compute and storage. Clouds are democratizing AI by providing access to open source AI libraries and by helping customers build customized AI models from their own data. In some ways, these AI services are how the clouds entice customers to bring their data to the cloud.
Practically every global hyperscale cloud provider (e.g., AWS, Google, IBM, Microsoft, Alibaba) is investing up to billions of dollars to develop its AI ecosystem. According to Bernie Trudel, chairman of the Asia Cloud Computing Association, “The cloud-based AI market will see a tenfold year-to-year growth rate, increasing its share in the global cloud computing market from 1% to 10% by 2025.”[iii]
However, enterprises want to leverage AI innovation from multiple public clouds without getting locked into any single cloud, and they are turning to hybrid/multicloud solutions to keep their options open and retain some management control.
In particular, they are:
1) Hesitant to put their sensitive data in the cloud.
2) Eager to leverage innovation from multiple cloud providers.
3) Sensitive about data egress costs in the cloud.
All of this is driving enterprises to seek multicloud AI architectures where data – the stateful part of their application – resides under their private control at a neutral data center location with high-bandwidth connectivity to multiple clouds.
Equinix is helping enterprises build these hybrid/multicloud AI architectures with a platform that provides services such as:
1) Equinix Cloud Exchange Fabric (ECX Fabric) enables businesses to reliably and consistently move data across multiple locations and hybrid/multicloud infrastructures in a high-performance, secure manner.
2) Equinix SmartKey™ allows customers to encrypt their data and compute containers across multiple clouds while storing their keys at a neutral location (the sketch after this list illustrates the underlying key-separation pattern).
3) A global platform of 200 Equinix International Business Exchange™ (IBX™) data centers, typically within 1 to 2 milliseconds of the hyperscaler data centers, with 10 to 100 Gbps connectivity to many leading cloud providers.
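To illustrate the key-separation pattern behind item 2, here is a minimal, hypothetical Python sketch using the open source `cryptography` package: data is encrypted before it reaches any cloud, while the key material stays with a service at a neutral location. The `NeutralKeyService` class and its methods are invented stand-ins for illustration, not the SmartKey API.

```python
# Minimal sketch of the key-separation pattern described above (not the
# SmartKey API): data is encrypted before it goes to any cloud, while the
# key material stays at a neutral location the customer controls.
from cryptography.fernet import Fernet

class NeutralKeyService:
    """Hypothetical stand-in for a key service hosted at a neutral facility."""
    def __init__(self):
        self._keys = {}

    def create_key(self, key_id: str) -> None:
        self._keys[key_id] = Fernet(Fernet.generate_key())

    def encrypt(self, key_id: str, plaintext: bytes) -> bytes:
        return self._keys[key_id].encrypt(plaintext)   # key never leaves the service

    def decrypt(self, key_id: str, ciphertext: bytes) -> bytes:
        return self._keys[key_id].decrypt(ciphertext)

# The application encrypts locally, then hands only ciphertext to the clouds.
kms = NeutralKeyService()
kms.create_key("sensor-archive")
ciphertext = kms.encrypt("sensor-archive", b"engine telemetry batch 42")
# ... ciphertext can now be stored or processed in one or more public clouds ...
print(kms.decrypt("sensor-archive", ciphertext))
```

Because the clouds only ever see ciphertext, a customer can use multiple providers' AI services while the keys, and therefore ultimate control over the data, remain at the neutral location.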
In conclusion, Equinix sees artificial intelligence technology as a tremendous growth opportunity for our customers in all industries, and we are poised and ready to help them develop distributed, high-performance, scalable and secure hybrid/multicloud AI architectures.
Learn more about Platform Equinix and how you can leverage private interconnection to develop distributed AI architectures that reach everywhere, interconnect everyone and integrate everything.
Be sure to check out our other Predictions 2019 blog posts:
- 5 IT Predictions for Digital Business in 2019
- Part 1: Paving a Path to the Promise of 5G
- Part 2: Riding the Rise of Distributed Artificial Intelligence Architectures
- Part 3: Un-blocking the Chain
- Part 4: Maneuvering the Data Privacy Maze
- Part 5: Tapping Interconnection to Tame Cloud Complexity
[i] PwC, “AI to drive GDP gains of $15.7 trillion with productivity, personalisation improvements,” 2017.
[ii] IDC, “FutureScape: Worldwide IT Industry 2018 Predictions,” 2017.