Equinix and NVIDIA AI LaunchPad Accelerate AI from Hybrid Cloud to Edge

End-to-end NVIDIA solution integrates with Equinix Fabric software-defined interconnection to launch instant AI infrastructure

Doron Hendel
Kaladhar Voruganti

Artificial intelligence (AI) has become mainstream. According to IDC, “By 2025, at least 90% of new enterprise apps will embed AI; by 2024, over 50% of user interface interactions will use AI-enabled computer vision, speech, natural language processing, and AR/VR.”[1]

Service bots, fraud detection, retail checkouts and recommenders for online shopping are just a few of the AI-driven innovations that automate repetitive tasks. This is why we are excited that Equinix® will be the first digital infrastructure company to offer NVIDIA AI LaunchPad’s “instant AI infrastructure” in our more than 220 interconnected data centers worldwide.

See how digital business connects with Equinix Fabric

Connect digital infrastructure and services on demand at software speed via secure, software-defined interconnection. Scale hybrid deployments, achieve network agility, directly connect to partners and providers easily and securely.


Moving from centralized to distributed AI infrastructures

We are increasingly seeing customers shift from a centralized AI model, where AI model training and inference take place at the same location, to a distributed model, where these capabilities run at different locations for performance, cost and privacy/compliance reasons.

The following trends are driving this move from a centralized to a distributed AI infrastructure:

  • More data is getting generated at the edge: In many use cases it becomes very expensive to backhaul large data sets to a central cloud for processing. Instead, it is more cost-effective to process this data at the edge, close to where the data is generated.
  • Data residency and compliance laws: Around 142 countries have data residency laws mandating that their citizens’ personally identifiable information (PII) cannot be taken out of the country for processing.[2] Thus, an AI solution must process data locally and then aggregate the insights across these different locations in a federated manner. Increasingly, global enterprises need local AI stacks in multiple countries to comply with data residency requirements.
  • AI hardware is becoming denser: Increasingly, AI programs are tackling challenging problems that process more data and need more compute power. As a result, next-generation AI hardware stacks are becoming denser and more powerful, with a higher power draw (>30 kW per rack). Traditional private data centers are not equipped to handle these power and cooling needs. Thus, it is necessary to place this dense AI hardware in private cages at colocation data centers that can satisfy power and cooling requirements.
  • AI apps need external data sources: In many cases this data will span across public clouds, edge devices, private data centers and data brokers. For cost and performance reasons, it makes sense to host the AI solution stack at an interconnection hub that has secure and high-speed connections to these different data sources.
  • Low latency requirements: Applications such as autonomous vehicles, factory robotic automation and AR/VR-based predictive maintenance require very low latency performance. In many cases, these applications cannot tolerate the network latency associated with AI model inference requests to a central cloud. Thus, there is a need to perform AI inference operations at the edge.

As evident from the above trends, AI processing cannot occur solely in a centralized public cloud for cost, compliance/privacy and performance reasons. Furthermore, in many instances, private data centers cannot host next-generation GPU-based AI hardware due to its higher power and special cooling requirements. Thus, there is a need for an end-to-end AI solution that makes it easy for enterprises to train their models and run inference across distributed locations, spanning from centralized clouds to different types of edge locations.
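
To make the federated pattern described above concrete, here is a minimal sketch, in plain Python with no external dependencies, of how per-country sites could keep raw PII in place and share only aggregate model updates with a central coordinator. The site names, sample counts and weighting scheme are illustrative assumptions, not part of the NVIDIA or Equinix products.

```python
# Hypothetical sketch: federated aggregation of locally computed model updates.
# Each site keeps its raw (PII) data in-country and only ships aggregate
# statistics -- here, locally trained model weights and a sample count.

from dataclasses import dataclass
from typing import List

@dataclass
class SiteUpdate:
    site: str                 # e.g. an in-country location (illustrative name)
    samples: int              # number of local training examples
    weights: List[float]      # locally trained model parameters

def federated_average(updates: List[SiteUpdate]) -> List[float]:
    """Combine per-site weights into a global model, weighted by sample count."""
    total = sum(u.samples for u in updates)
    dim = len(updates[0].weights)
    global_weights = [0.0] * dim
    for u in updates:
        share = u.samples / total
        for i, w in enumerate(u.weights):
            global_weights[i] += share * w
    return global_weights

# Example: three in-country sites contribute updates; only aggregates leave each site.
updates = [
    SiteUpdate("site-a", samples=8_000, weights=[0.21, -0.40, 1.10]),
    SiteUpdate("site-b", samples=5_000, weights=[0.25, -0.38, 1.05]),
    SiteUpdate("site-c", samples=2_000, weights=[0.19, -0.45, 1.20]),
]
print(federated_average(updates))   # global model without moving raw data
```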

NVIDIA AI LaunchPad on Platform Equinix is accelerating AI from hybrid cloud to the edge

Equinix is enabling the NVIDIA AI LaunchPad solution on Platform Equinix® to deliver instant AI infrastructure for enterprises. This end-to-end solution provides AI core infrastructure for model training with NVIDIA DGX systems, as well as inference and AI edge infrastructure with NVIDIA-Certified Systems built on the NVIDIA EGX platform. Equinix Fabric provides high-speed, secure connectivity between these distributed training and inference locations.

In addition to providing the AI compute, network and storage infrastructure, NVIDIA AI LaunchPad provides the software-based orchestration services needed to move data and AI models between the distributed sites seamlessly using cloud technologies. Customers can manage their AI development workflow with NVIDIA Base Command and use NVIDIA Fleet Command for easy, secure management and deployment of AI at the edge. The Equinix infrastructure deploys in minutes, giving enterprises immediate access to an entire spectrum of NVIDIA resources that support virtually every aspect of AI, from data center training and inference to full-scale deployment at the edge.
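
The exact orchestration mechanics belong to NVIDIA Base Command and Fleet Command; as a generic, hypothetical illustration of the underlying core-to-edge pattern (train centrally, publish a versioned model artifact, then verify and stage it at each edge site), here is a small Python sketch using only the standard library. The registry path, model name and edge directories are made-up placeholders, not product APIs.

```python
# Hypothetical sketch of the core-to-edge model promotion pattern:
# a model trained at the core site is published as a versioned, checksummed
# artifact, then each edge site verifies and stages it for local inference.

import hashlib
import json
import shutil
from pathlib import Path

def publish_model(model_file: Path, registry: Path, name: str, version: str) -> Path:
    """Copy the trained model into a shared registry with a checksum manifest."""
    target_dir = registry / name / version
    target_dir.mkdir(parents=True, exist_ok=True)
    artifact = target_dir / model_file.name
    shutil.copy2(model_file, artifact)
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
    manifest = {"name": name, "version": version, "sha256": digest}
    (target_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return target_dir

def deploy_to_edge(published: Path, edge_root: Path) -> None:
    """Pull the artifact at an edge site and verify it before serving."""
    manifest = json.loads((published / "manifest.json").read_text())
    artifact = next(p for p in published.iterdir() if p.name != "manifest.json")
    staged = edge_root / manifest["name"] / manifest["version"] / artifact.name
    staged.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(artifact, staged)
    if hashlib.sha256(staged.read_bytes()).hexdigest() != manifest["sha256"]:
        raise ValueError(f"checksum mismatch at {staged}")

# Example flow (placeholder paths): publish once at the core, then fan out to edge sites.
# published = publish_model(Path("detector.onnx"), Path("/registry"), "detector", "1.0.0")
# for edge in [Path("/edge/site-a"), Path("/edge/site-b")]:
#     deploy_to_edge(published, edge)
```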

NVIDIA AI LaunchPad on Platform Equinix with Equinix Fabric

How Equinix is providing fast and secure interconnection to distributed data sources

Equinix Fabric™ software-defined interconnection services provide fast and secure data transfer from distributed data sources to the NVIDIA AI model training stack. The same private interconnection solution also enables the transfer of newly developed AI models to the NVIDIA AI edge infrastructure at Equinix. Enterprises can deploy their AI training and edge infrastructure on Platform Equinix in 63 metro markets across 26 countries on five continents. All of these distributed Equinix sites are interconnected via Equinix Fabric high-speed, low-latency and secure virtual connections. Equinix delivers global organizations a consistent and reliable data center and interconnection platform experience, so they don’t have to deal with multiple data center processes from different vendors across the globe.

As the world’s digital infrastructure company, Equinix provides a digital ecosystem where more than 10,000 businesses, including cloud and network service providers and enterprises, conduct business today. Equinix Fabric provides connectivity between all these organizations spanning across the globe. As a result, deploying an AI stack at Equinix provides enterprises with high-speed and secure access to data from these different organizations.

The strategic location of Equinix International Business Exchange™ (IBX®) data centers is important for moving data to the AI stacks from different locations with very low latency, which satisfies the needs of real-time applications. Finally, Equinix IBX data centers have been verified by NVIDIA to meet the power and cooling requirements of next-generation AI hardware.
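
As a rough, back-of-the-envelope illustration of why inference placement matters for real-time applications, the short sketch below compares a hypothetical latency budget across placement options. All of the numbers (budget, inference time and round-trip times) are assumed for illustration and are not measured Equinix or NVIDIA figures.

```python
# Back-of-the-envelope latency budget for a single real-time inference request.
# All numbers below are illustrative assumptions, not measured figures.

def end_to_end_ms(network_rtt_ms: float, inference_ms: float, overhead_ms: float = 2.0) -> float:
    """Total request latency: network round trip + model inference + serialization overhead."""
    return network_rtt_ms + inference_ms + overhead_ms

BUDGET_MS = 50.0       # e.g. a robotics / AR control-loop allowance (assumed)
INFERENCE_MS = 12.0    # GPU inference time for one request (assumed)

for placement, rtt in [("same-metro edge", 2.0), ("regional colo", 15.0), ("distant cloud region", 80.0)]:
    total = end_to_end_ms(rtt, INFERENCE_MS)
    verdict = "fits" if total <= BUDGET_MS else "misses"
    print(f"{placement:22s} -> {total:5.1f} ms ({verdict} the {BUDGET_MS:.0f} ms budget)")
```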

NVIDIA CEO Jensen Huang said yesterday at Equinix Analyst Day, “Companies have obstacles like needing computing close to the data to react quickly or keeping data transit costs down, or because of data privacy, lack of data ownership, or sovereignty concerns. With Equinix, we are able to deliver a turnkey, state-of-the-art, edge-to-cloud AI computing infrastructure that will integrate seamlessly into our customers’ own infrastructure, regardless of where that is.”

The next steps toward an end-to-end AI infrastructure

Equinix and NVIDIA are building an AI ecosystem of technology providers, ISVs, tool developers, data brokers and network providers, to name a few, to help democratize AI.

“We’re combining the power of NVIDIA AI with the global reach and ecosystem access of Equinix to unlock the power of AI for the masses. With NVIDIA AI LaunchPad and our partnership with Equinix, the world’s digital infrastructure company, we are democratizing and launching AI for the world’s enterprises.” – Jensen Huang, CEO, NVIDIA

NVIDIA AI LaunchPad combined with Equinix services provides a complete development-to-deployment AI infrastructure, available on a subscription basis. AI LaunchPad users can scale their NVIDIA-enabled AI infrastructure at Equinix as they scale their business around the world, while taking advantage of best-in-class AI hardware, software and ease of use.

Learn more about how Equinix Fabric can directly and securely connect your digital infrastructure on Platform Equinix.

You may also want to read: Artificial Intelligence: From the Public Cloud to the Device Edge
[1]  IDC FutureScape: Worldwide IT Industry 2020 Predictions, October 2019, Doc # US45599219.

[2] “2020 Ends a Decade of 62 New Data Privacy Laws,” Graham Greenleaf and Bertil Cottier, January 29, 2020.

Doron Hendel Former Global Business Development
Kaladhar Voruganti Senior Business Technologist