How to Create Cloud-Based Data Architectures

Leverage artificial intelligence and machine learning for greater value

Data, analytics and artificial intelligence (AI)/machine learning (ML) have become a sustainable competitive advantage used to transform today’s enterprises into digital businesses. Data architectures are rapidly evolving in ways that are transforming all industries and motivating enterprises to consider new approaches to capitalize on their data assets. Many cloud providers seem to be in an arms race, building out AI/ML platforms that enterprises can readily integrate into their data architectures. Building these data architectures on Platform Equinix™ not only allows you to leverage new data strategies to create modern data architectures, but also to harness an interconnection-first strategy and adopt competitive hybrid and multicloud capabilities.

According to LogicMonitor’s Cloud Vision 2020: The Future of the Cloud Study, 66% of the enterprises surveyed see AI/ML as one of the leading factors driving greater public cloud adoption in 2020. A modern data architecture is imperative in today’s global digital economy to give enterprises the speed and agility required to transform into digital businesses, leverage cloud, and quickly access and unify data using evolving technologies such as AI/ML and the Internet of Things (IoT).

Why is a modern AI/ML data architecture important?

Enterprises are collecting more data than ever before from existing and new internal and external sources. Traditional data environments cannot sustain these escalating volumes of data or support the following capabilities, which digital businesses and their users need to gain maximum value from that data:

  • Real-time, self-service data processing and analytics, which enable business users to be self-sufficient and give them the flexibility to iteratively develop their own analytics on demand (preventing shadow analytics). Data architectures must be built upon a dynamic, scalable data infrastructure and cloud environment to drive as much data processing and analytical capability as possible.
  • Faster access to data, which is required for discovery and analysis to yield new commercial and research and development insights; these insights also require state-of-the-art AI/ML capabilities.
  • The delivery of data processing and analytics at the optimal point of impact, enabling AI/ML-based predictive and prescriptive analytics integration from the core corporate data center infrastructure to the edge of the enterprise.
  • Inexpensive computing at the enterprise edge, enabling huge amounts of information to be captured. This means a mind-boggling variety of unstructured data will need to be analyzed in real-time, be it video, temperature data from an IoT solution or comments from social media.
  • More flexible and agile data architectures that require less hardware for excess compute and storage capacity and scale to meet the real-time needs of a competitive, growing digital business. Real-time analytics requires on-demand business information that provides snapshots of what is happening within the business here and now, leading to real-time insights.
  • Support for evolving technologies that enable lower-cost, flexible, on-demand data management and analytics on AI/ML-based cloud platforms.
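
As a minimal sketch of the real-time, edge-side processing described above (the sensor values, window size and class name are illustrative assumptions, not a specific product API), a rolling-window aggregation over streaming IoT readings can provide the on-demand "here and now" snapshots that real-time analytics needs:

```python
from collections import deque
from statistics import mean

class RollingWindow:
    """Keep only the most recent `size` sensor readings in memory and
    expose an on-demand snapshot of the stream's current state."""

    def __init__(self, size=5):
        self.readings = deque(maxlen=size)  # old readings are evicted automatically

    def add(self, value):
        self.readings.append(value)

    def snapshot(self):
        # Real-time "here and now" view, computed only when requested
        return {"count": len(self.readings),
                "mean": mean(self.readings),
                "max": max(self.readings)}

window = RollingWindow(size=3)
for temp in [21.0, 22.5, 23.0, 40.0]:  # the 40.0 reading evicts 21.0
    window.add(temp)

print(window.snapshot())
# → {'count': 3, 'mean': 28.5, 'max': 40.0}
```

In a production edge deployment the same pattern would typically run inside a stream-processing framework rather than a hand-rolled loop, but the idea is the same: aggregate in place so only summaries, not the full raw stream, need to move upstream.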

The opportunities of new data environments

Traditional data environments are feeling the pressures of increasing data volumes and real-time analytics demands and are unable to address the challenges of new digital business requirements. Re-architecting these data environments for greater access and scalability provides the following significant opportunities:

  • A unified, single “source of truth”: one place accountable for presenting accurate data in an integrated customer/product view, including accurate contracts, pricing, billing, invoices, etc.
  • Self-service that empowers more business users to perform analytics and embeds analytical capabilities in processes, enabling analytics to be securely accessed within and outside the enterprise.
  • Faster access to rapidly growing volumes of incoming data, accelerating the ability to discover and analyze a variety of data sources, including unstructured data, audio/video and social media feeds.
  • The democratization of data and analytics in any location, at any time and on any device (i.e., smartphone, tablet, desktop, etc.). Data democratization also means breaking down the silos between different business units.
  • Proactive management, integration and analysis of real-time IoT data. Internal log data must also be inspected in real time to protect against unauthorized intrusion and to monitor software applications.
  • A high-capacity, scalable data platform that collects from any number of different data sources, along with on-demand burst capacity and scale, which most AI/ML models and algorithms require.
  • A data lab/research platform that uses the latest AI/ML capabilities for exploratory data science efforts that further predictive and prescriptive analytics.

Transforming to an interconnection-first data architecture

An Interconnection Oriented Architecture™ (IOA®) framework for data provides you with the foundation for interconnected data. It addresses the need to localize data requirements, balance protection with accessibility, and govern data movements between on-premises and cloud technologies.

Any enterprise has three different options for putting data in the cloud (see diagram below):

  • Put new applications and data in the public cloud.
    • A faster path to cloud compute and storage resources without concerns about legacy architectures
  • Migrate existing applications and data (“lift & shift”).
    • Takes longer to plan and deploy
    • Can be a costlier implementation, requiring greater programming resources
  • Extend the existing on-premises data infrastructure to the cloud in a hybrid IT infrastructure.
    • Quickly meets new business demands
    • Enables greater security (private and public cloud)
    • Allows burst capacity in the cloud on-demand

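The hybrid option's on-demand burst can be sketched as a simple placement rule that keeps work on existing on-premises infrastructure until local capacity is exhausted, then overflows to the cloud. The capacity number, job names and `on_prem`/`cloud` labels below are illustrative assumptions, not a real scheduler API:

```python
ON_PREM_CAPACITY = 4  # illustrative: concurrent jobs the local infrastructure can run

def place_jobs(jobs, on_prem_capacity=ON_PREM_CAPACITY):
    """Assign each job to on-premises infrastructure until capacity is
    exhausted, then burst the remainder to on-demand cloud capacity."""
    placements = {}
    for i, job in enumerate(jobs):
        placements[job] = "on_prem" if i < on_prem_capacity else "cloud"
    return placements

jobs = [f"analytics-{n}" for n in range(6)]
print(place_jobs(jobs))
# analytics-0 through analytics-3 stay on-prem; analytics-4 and -5 burst to the cloud
```

A real hybrid deployment would make this decision on live utilization metrics and data-locality constraints rather than a fixed count, but the overflow principle is the same.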
With many of our customers opting for the hybrid IT approach, we recommend the following guidelines:

  • Extend your data capacity and accessibility using a phased approach to move to a modern data architecture.
  • Clearly articulate target data states and the drivers of competitive differentiation in your data architecture.
  • Create foundational data capabilities in parallel with business priority-driven efforts.
  • Use the Equinix Cloud Exchange™ (ECX) Fabric on Platform Equinix to connect to clouds and data anywhere at any time.
  • Store and protect your data and manage privacy regulations (e.g., GDPR, HIPAA, etc.) at a local level through optimized data and metadata repositories using Data Hub™ on Platform Equinix.
  • Learn about our Equinix SmartKey™ solutions (currently in public beta) to securely manage, protect and access your data.
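
Managing privacy regulations "at a local level" amounts to routing each record to a repository in a compliant region. A toy residency router illustrates the idea; the region codes and repository names here are assumptions for illustration only:

```python
# Map data-subject regions to the local repository where records must stay
REPOSITORIES = {
    "EU": "eu-metadata-store",  # e.g., GDPR: keep EU personal data in-region
    "US": "us-metadata-store",  # e.g., HIPAA-regulated records stay in-region
}

def route_record(record):
    """Return the repository that satisfies the record's residency rules,
    refusing to store data whose region has no compliant repository."""
    region = record.get("region")
    if region not in REPOSITORIES:
        raise ValueError(f"no compliant repository for region {region!r}")
    return REPOSITORIES[region]

print(route_record({"customer_id": 42, "region": "EU"}))
# → eu-metadata-store
```

In practice the routing table would be driven by metadata in the repository itself, so that residency policy lives alongside the data it governs rather than in application code.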

A modern data architecture using interconnection and cloud technology to fast-track value delivery

The diagram below provides a data platform architecture that seamlessly integrates data and applications from various sources with cloud-based compute and storage capacity and AI/ML tools to accelerate the value that can be obtained from large amounts of data.

By considering cloud-based technologies for creating a data backbone and generating data insights that use AI/ML tools to accelerate value delivery, you can harmonize core information from disparate data sources. This will provide an integrated view of strategic information (e.g., customer, security, business intelligence, etc.) across product lines and business units, optimizing customer experience, security and revenue growth opportunities.
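
Harmonizing core information from disparate sources into one integrated view can be sketched as a field-priority merge, where the source order encodes which system is authoritative for conflicting fields. The source systems, field names and precedence below are illustrative assumptions:

```python
# Records for the same customer from two different systems of record
billing = {"customer_id": 7, "name": "ACME Corp", "invoice_total": 1200.0}
crm     = {"customer_id": 7, "name": "ACME Corporation", "segment": "enterprise"}

# Later sources win on conflicting fields; order encodes trust/priority
SOURCE_PRIORITY = [billing, crm]  # assume CRM is authoritative for names

def unified_view(sources):
    """Merge per-source records into one integrated customer record,
    letting higher-priority (later) sources overwrite shared fields."""
    merged = {}
    for record in sources:
        merged.update(record)
    return merged

view = unified_view(SOURCE_PRIORITY)
print(view["name"])  # → ACME Corporation (CRM wins over billing)
```

Real master-data-management pipelines add record matching, survivorship rules per field and audit trails, but the core of a "single source of truth" is exactly this kind of deterministic, priority-driven reconciliation.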

To learn more, read the Platform Equinix Vision paper.
