Do You Know Where To Put Your Data?

Look to the edge to increase data performance and security

One of the biggest decisions a business needs to make is where to put all of its data. Should you keep it on-premises, move it to the cloud, or place it in edge data centers where it is closer to users, applications and analytics? And once you’ve decided where it needs to reside, there are other questions to answer, such as: How can you move data securely between these different environments, or integrate them to leverage analytics for the best possible insights?

As we’ve worked with leading businesses all over the world, we’ve found that the right answer to the “where do I put my data” question for most companies is a hybrid IT environment made up of on-premises, cloud and edge infrastructures. However, as data creation and consumption trends swing toward the edge and away from centralized corporate data centers, it becomes increasingly important to have proximate, direct and secure interconnection between data and users, applications, analytics, artificial intelligence (AI) and clouds.

The pendulum swings toward the edge

The amount of data being created and consumed at the edge is increasing rapidly. According to the Gartner report, “The Edge Completes the Cloud,” by analysts Bob Gill and David Smith:

“The ‘where to store and process the data’ pendulum (see Figure 1 below) has swung between highly centralized approaches (such as mainframes or centralized cloud services) and more-decentralized approaches (such as PCs, mobile devices and people).”[i]

Figure 1: Cycles of Centralization: The Pendulum Stops for Edge and Cloud

The advent of digital technologies such as cloud, the internet of things (IoT) and 5G has paved the way for data at the edge, but data management challenges remain. These include:

  • Enabling global access to distributed data: As industries, such as retail, manufacturing and transportation, go through digital transformation, data needs to be stored, processed and analyzed at the edge on a global basis while ensuring data privacy and regional compliance.
  • Managing real-time streaming data: There are two facets of managing real-time streaming data that need to be considered:
    • Economics – With so much data being generated from devices (telemetry, telematics, device status, etc.) at the edge, it can be cost-prohibitive to send all of that data to the cloud (see the sketch after this list).
    • Richer analytics – Deploying analytics and AI models across diverse data sets (e.g. real-time weather information, customer buying trends, social media feeds, etc.) is important because of the richer insights it can generate for businesses.
  • Deploying hybrid/multicloud data repositories: Most enterprises started their journey to the cloud by moving workloads to a single cloud platform. However, as cloud providers began to differentiate themselves competitively, enterprises increasingly wanted to run certain workloads on best-in-class cloud services, and some corporate policies now mandate the use of multiple cloud providers. That creates an ongoing need to move data from cloud to cloud with agility, which can be not only expensive due to egress charges, but also architecturally inefficient.
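
To illustrate the economics point above, here is a minimal sketch of edge-side aggregation: raw device telemetry is summarized locally so that only compact per-device records cross the network to a cloud. The Reading fields, the summarization logic and the fault rule are illustrative assumptions, not a prescribed implementation.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical telemetry reading from an edge device (field names are illustrative).
@dataclass
class Reading:
    device_id: str
    temperature_c: float
    status: str

def summarize_at_edge(readings: list[Reading]) -> list[dict]:
    """Aggregate raw device telemetry locally so only compact summaries
    are forwarded to the cloud, rather than every raw sample."""
    by_device: dict[str, list[Reading]] = {}
    for r in readings:
        by_device.setdefault(r.device_id, []).append(r)

    summaries = []
    for device_id, batch in by_device.items():
        temps = [r.temperature_c for r in batch]
        summaries.append({
            "device_id": device_id,
            "samples": len(batch),
            "avg_temperature_c": round(mean(temps), 2),
            "max_temperature_c": max(temps),
            "faults": sum(1 for r in batch if r.status != "ok"),
        })
    return summaries

# Example: many raw readings collapse into one record per device
# before any traffic crosses the WAN to a cloud provider.
raw = [Reading("sensor-1", 21.4, "ok"), Reading("sensor-1", 22.0, "ok"),
       Reading("sensor-2", 38.9, "fault")]
print(summarize_at_edge(raw))
```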

Companies can handle these and many more data management challenges by:

  • Tying globally distributed data deployments together in a robust, consistent, high-performance and compliant manner.
  • Partitioning data processing between on-premises infrastructure, cloud and edge data centers, based on application and cost analyses, to help manage data ingress and egress charges (see the cost sketch after this list).
  • Keeping data close to where users, applications, analytics, AI and clouds reside to lower latency and improve performance when exchanging and integrating data sets and when applying analytics to streaming data in real time.
  • Leveraging a vendor-neutral, cloud-proximate interconnection platform where organizations can run stateless applications (those that do not depend on data saved from session to session) on multiple clouds against the same data set.

By creating distributed data repositories and database nodes in strategic locations at the edge that bypass the public internet and leverage high-bandwidth, low-latency private interconnection, businesses can see immediate gains in cost efficiency, performance and security.

Managing your data at the edge

Equinix is an integral part of today’s edge architecture. Specifically, our customers leverage Equinix as an interconnected, aggregated edge data platform, made up of various digital and business ecosystems. Our 200 global International Business Exchange™ (IBX®) data center locations are strategically close to customers’ remote edge locations, while at the same time being proximate to ecosystems such as public clouds, network providers and other business partners (see Figure 2 below).

Figure 2. The Interconnected Aggregated Edge

From a data management perspective, businesses can place their edge data and workloads on Platform Equinix® to:

  • Leverage multiple ecosystems of networks, clouds and business partners to integrate distributed data repositories and gain richer insights across all data sets from analytics and AI.
  • Store, process and analyze streaming data from digital data sets locally, then move portions of that data or its metadata to the cloud where it makes sense, based on a company’s economic priorities, key business applications and local compliance regulations (see the sketch after this list).
  • Harness a vendor-neutral hybrid/multicloud interconnection platform to segment data processing and analysis, using the best cloud service for the job.

One example of how local proximity to data at the edge can improve data access is data center information technology and security audits. “The Total Economic Impact™ of Equinix,” a commissioned study conducted by Forrester Consulting on behalf of Equinix and published in April 2019,[ii] built a composite organization from customer interviews and financial analyses to illustrate the financial impact of using private interconnection solutions on Platform Equinix. The study showed that by distributing their data at the edge with Equinix, organizations could complete complex information technology and security audits 60% faster. Because Equinix metro locations serve as data transit points, data no longer has to be backhauled to centralized on-premises servers, which reduces the length and complexity of data center audits.

The Data Blueprint shows you how to localize your data at the edge, balance accessibility with data protection, and govern data movement and placement between on-premises, colocation and cloud infrastructures.

You might also be interested in reading the following blogs:

How to Converse in Cloud: Cloud Storage

How ASE, Equinix and NetApp Deliver Reliable Multicloud and Storage Performance

The Cloud Storage Performance Dilemma


[i] Gartner, “The Edge Completes the Cloud,” Bob Gill & David Smith, 14 September 2018.

[ii] “The Total Economic Impact™ of Equinix,” Forrester Consulting, April 2019.