Artificial Intelligence Initiative

Don Wiggins

My colleagues and I at Equinix have recently seen a significant uptick in both interest in and practical application of artificial intelligence (AI) solutions that leverage Interconnection Oriented Architecture™ (IOA™) best practices on our global interconnection and data center platform. Our observations continue to validate the benefits of close proximity for transacting at the edge of networks, clouds, applications and data. New use cases keep emerging where physics (the speed of light) once wreaked havoc on collaborative yet physically dispersed systems of record. As proximity lowers latency and allows throughput capacity for data transfer and ingest to scale dramatically, physics becomes far less of a constraint. Geo-strategic, regionally distributed IT architectures let next-generation applications, and the technology that powers them, render real-time insights. Nowhere is this more apparent than in AI, where interdependent source data enables a system to learn, reason, solve problems, perceive and achieve measures of language understanding from a continuous stream of content, all essential elements of AI development.
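
To put the physics in concrete terms, here’s a quick back-of-the-envelope calculation. The distances below are illustrative assumptions; light travels through optical fiber at roughly 200,000 km/s:

    # Light propagates through optical fiber at roughly 200,000 km/s
    # (about two-thirds of its speed in a vacuum).
    FIBER_KM_PER_SEC = 200_000

    def round_trip_ms(distance_km: float) -> float:
        """Best-case round-trip propagation delay over fiber, in milliseconds."""
        return 2 * distance_km / FIBER_KM_PER_SEC * 1000

    # Illustrative distances: a transcontinental hop vs. a metro-area edge hop.
    print(round_trip_ms(4_000))  # ~40 ms to a data center 4,000 km away
    print(round_trip_ms(50))     # ~0.5 ms to an edge facility in the same metro

Even before any processing begins, a distant facility imposes tens of milliseconds of unavoidable round-trip delay that a nearby edge facility does not.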

A collaborative approach

Equinix is the world’s largest carrier-neutral interconnection platform, and it’s equally important to note what is done “at” Equinix versus what is done “by” Equinix. Since our very beginning, we’ve had a laser focus on providing a predictable, secure and high-performance private interconnection alternative to the public internet, whose performance, reliability and security can be unpredictable. By operating private, high-capacity peering platforms within our 200 International Business Exchange™ (IBX®) data centers across the globe, literally thousands of cloud, network and managed service providers can conduct business here, resulting in a shared customer base from every vertical imaginable.

Having achieved critical mass from an interconnection perspective, the next logical progression is to facilitate ecosystem-driven marketplaces, where business and public sector entities alike can conduct private, secure, high-speed, low-latency transactions that ultimately create private and/or federated service brokerages. One of the digital edge services gaining the most traction is data brokerage, and a key beneficiary of this trend is AI. Here are just a few telling examples of the data volumes now being generated, which can be leveraged to render important insights:

Data Source          | Daily Volume of Data Generated per Instance
Commercial aircraft  | 5-8 terabytes
Smart car            | 4 terabytes
Smart hospital       | 3 terabytes

These growing data volumes present difficult challenges for traditional archival and transfer approaches. The data must be efficiently moved and ingested for processing, often alongside related yet disparate data sources, to render insights across the internet of things (IoT), healthcare, defense, financial services, energy and many other industries actively engaged in digital transformation.
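
As a rough sketch of what that ingest step can look like, the snippet below pulls several disparate feeds concurrently and tags each batch with its origin. All of the names here (SOURCES, fetch_batch and so on) are hypothetical placeholders, not any particular product’s API:

    import concurrent.futures

    # Hypothetical feeds standing in for the sources in the table above.
    SOURCES = ["aircraft_telemetry", "smart_car_sensors", "hospital_devices"]

    def fetch_batch(source: str) -> list[dict]:
        # Placeholder connector; a real one would read from a queue or object store.
        return [{"source": source, "value": i} for i in range(3)]

    def ingest(source: str) -> dict:
        """Pull one feed's latest batch and tag it with its origin so downstream
        analytics can correlate related-but-disparate records."""
        records = fetch_batch(source)
        return {"source": source, "count": len(records), "records": records}

    # Pull the feeds concurrently so one slow link doesn't serialize the whole run.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        batches = list(pool.map(ingest, SOURCES))

    print([(b["source"], b["count"]) for b in batches])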

Equinix’s global platform is an ideal place to geo-strategically ingest, process and derive insights from that data, because thousands of platform vendors and service providers are on hand to provide governance, data brokerage, data fabric and distributed edge analytics platforms.

Our digital interconnection platform provides ready access to participating partners, clouds, networks, data brokers and third-party algorithms, making it well suited to developing and curating a data marketplace. Why are we placing such an emphasis on this initiative? Trending indicators include the following:

  • By 2019, 75% of analytics solutions will leverage ~10 external data sources[i]
  • By 2020, 90% of large enterprises will sell Data-as-a-Service[ii]
  • Some 300 data aggregators/brokers now specialize across industry verticals

What’s also interesting is the synergistic approach that U.S. Government entities are taking with private industry on our platform, developing AI solutions to address mission-critical requirements that often carry national security implications. Many of these efforts center on digital edge analytics, where proximity to services and mission partners has become game changing in the development of next-generation applications. These partner-based public/private sector consortiums will be a major determining factor in the successful implementation of advanced technologies in this field. The February 2019 U.S. Executive Order on Maintaining American Leadership in Artificial Intelligence and the White House’s Select Committee on AI (SCAI) are indicators of the emphasis that’s been placed on AI development, and the National Institute of Standards and Technology (NIST) is actively developing technical standards, tools and a broader framework to support this effort.

Key initiatives to enable and accelerate this effort include:

  • IT Modernization: The SCAI will provide technical expertise regarding AI and the modernization of federal technology, data and the delivery of digital services.
  • Better Data: Agencies will find ways to improve the quality and usability of, and access to, data prioritized by the AI R&D community.
  • Barriers and Protections: The Office of Management and Budget (OMB) will review impediments to AI R&D, including data access barriers, quality limitations, and privacy and security requirements.
  • The Department of Defense Joint Artificial Intelligence Center (JAIC) is actively developing a comprehensive framework for developing and applying AI across a number of mission requirements:
    • Predictive maintenance applications for military equipment, increasing the availability of combat systems and reducing operating costs.
    • Predictive analytics data models to better respond to disasters and improve rescue efforts.
    • Cyberspace event detection, network mapping and identification of compromised accounts.
    • Robotic process automation focused on automating back-end tasks to free people up for more complex work.

As indicated earlier, Equinix has assumed the role of facilitator for these initiatives, and integral to that role is providing industry insights gathered over time from a vibrant and extensive ecosystem of partners.

An essential building block for developing an AI capability is a data fabric. The following reference architecture illustrates best practices for beginning this process:

Digital Edge-Based Data Fabric Reference Architecture

The data fabric underpins efficient data flow from source to eventual analysis. The illustration above emphasizes a departure from siloed data repositories toward a regionally distributed architecture that enables geo-strategic, rapid ingest of large data volumes into edge-based analytics platforms.
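
One way to make “geo-strategic ingest” concrete is to route each data source to its nearest regional edge node. The sketch below does this with a simple great-circle distance calculation; the metro list and coordinates are illustrative assumptions, not a statement about any particular deployment:

    import math

    # Illustrative edge metros (latitude, longitude).
    EDGE_METROS = {
        "ashburn":   (39.04, -77.49),
        "frankfurt": (50.11, 8.68),
        "singapore": (1.35, 103.82),
    }

    def haversine_km(a, b):
        """Great-circle distance between two (lat, lon) points in kilometers."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    def nearest_edge(source_location):
        """Route a data source to the closest edge metro for ingest."""
        return min(EDGE_METROS, key=lambda m: haversine_km(source_location, EDGE_METROS[m]))

    print(nearest_edge((48.85, 2.35)))  # a Paris-based source routes to "frankfurt"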

Data Fabric Design Patterns

The design patterns illustrated above suggest a best practice approach to:

  • Distributed Data Repository
  • Data Cache & Edge Placement
  • Edge Analytics & Streaming Flows
  • Data Exchanges & Data Integration
  • Data Pipelines & Provenance

You can access these design patterns from the IOA Knowledge Base.
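
As a small illustration of the “Data Pipelines & Provenance” pattern, the sketch below runs a payload through a sequence of stages and appends a provenance record, with a timestamp and a content hash, for each one. The stage functions here are toy placeholders:

    import hashlib
    import json
    import time

    def run_stage(name, func, payload, provenance):
        """Apply one pipeline stage and record what it produced."""
        result = func(payload)
        provenance.append({
            "stage": name,
            "at": time.time(),
            "output_sha256": hashlib.sha256(
                json.dumps(result, sort_keys=True).encode()).hexdigest(),
        })
        return result

    # Two toy stages standing in for real cleansing/aggregation steps.
    provenance = []
    data = [4, 8, 15, 16, 23, 42]
    data = run_stage("filter_even", lambda d: [x for x in d if x % 2 == 0], data, provenance)
    total = run_stage("sum", lambda d: sum(d), data, provenance)

    print(total)                            # 70
    print(json.dumps(provenance, indent=2)) # an auditable record of each stage

The value of the pattern is that every downstream insight can be traced back through a verifiable chain of transformations to its source data.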

Equinix will remain heavily engaged in the curation of edge analytics and the digital marketplace, where secure and efficient exchange between data suppliers and consumers will continue to drive and accelerate AI initiatives across the spectrum.

You can learn more by reading the Federal Government Blueprint.

[i] UNICOM, “Data-Lake Day,” April 2019.

[ii] IDC, “FutureScape: Worldwide IT Industry 2018 Predictions,” October 2017.
