How to Establish a Neural Data System

Enable next generation data by integrating multiple services with private interconnection

Jason Sfaelos

The world of data is evolving at a staggering pace, and everyone is trying to keep up with its creation, management and value extraction. IDC predicts that by 2025 the global datasphere will grow to 175 zettabytes, more than 10x the amount of data generated in 2015.[i] Storage, historically the dominant part of the data conversation, is now one component of many. The data ecosystem is evolving to include other critical elements, such as blockchain, artificial intelligence (AI), machine learning (ML), edge analytics, and data fabrics, catalogs and exchanges, which can be integrated to deliver greater value.

Equinix is creating a conceptual framework that integrates data storage with these elements and more to meet the needs of a myriad of enterprise use cases that require safe and secure data exchange across multiple services to turn data into value. With that said, I am excited to introduce the Neural Data System.

Accelerate Digital Transformation Through AI: The Why, What and How?

Experts from IDC, NetApp, Nvidia and Equinix talk about the immediate and future impact of artificial intelligence (AI) in this webinar. Learn why enterprises are moving from centralized to distributed AI models, best practices for implementing hybrid cloud architectures that support AI, and much more.


What is a Neural Data System?

The human nervous system and how it interacts with the environment, as shown in the diagram below, provides a good illustration of how a neural data system works:

  • Perceive: Data Ingest is similar to the way humans use their senses to perceive the world. For example, IoT devices capture all types of data about the world around us.
  • React: Edge Analytics is similar to the reflex reactions humans have to certain stimuli. Our reflexes are either ingrained in our DNA (e.g. covering the back of your head when hearing a loud noise) or learned through experience (e.g. don’t touch the stove!). Edge analytics places AI models at the digital edge, close to the point where data is created, so that automated decisions and processes can be kicked off in near real time.
  • Memory recall: Data Tagging and Data Catalogs are similar to a human’s ability to recall distant memories, stored in different areas of the mind, from simple, short triggers. Data tagging enables a new form of metadata as a service: data is tagged as it’s being ingested so that other applications can later pull the relevant data they require.
  • Learning: ML / Deep Learning (DL) enable a deeper look at an extended amount of data that is then used to create new AI models both centrally and at the edge. This action is often referred to as “training” AI models or, to reverse the perspective, the model is learning – just as a human would – to better understand how things work and perhaps to change the automated/reflex actions to stimuli. Sometimes this includes providing AI models with contextual awareness.
  • Neurons: Data Fabrics are similar to how a human’s network of neurons in the brain move our impulses as they are required. Data fabrics perform critical functions focused on moving data based on system requirements and parameters in a highly efficient manner.
  • Memory: Data Storage maps to maintaining data over time in different formats and, just like the brain, it involves different types and locations (short-term and long-term memory).
  • Teaching/Creating: AI/Data Exchange is similar to teaching, which, while not part of the human nervous system, is a key enabler of the human intellect. AI/Data Exchange allows for entities to come together to share and exchange data and algorithms, in many cases creating new value.

How to set up a neural data system

As a starting point, it’s helpful to understand how these pieces fit together. The following diagram illustrates how each of the components intersects with the others.

Now we’ll walk through the steps for setting up a neural data system.

1. Data Ingest, Collection & Provenance

Data is ingested or collected from various sources around the world. Streaming data and/or continuous discrete data flows can come from a variety of sources, such as internet of things (IoT) sensors, point of sale (POS) systems, video and image files, or the bulk data shown on the right. Streaming data examples include:

  • A financial institution tracks market changes and adjusts settings to customer portfolios based on configured constraints (such as selling when a certain stock value is reached).
  • A power grid monitors throughput and generates alerts when certain thresholds are reached.
  • A news source streams clickstream records from its various platforms and enriches the data with demographic information so that it can serve articles that are relevant to the audience demographic.
  • An e-commerce site streams clickstream records to find anomalous behavior in the data stream and generates a security alert if the clickstream shows abnormal behavior.
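To make the last example concrete, here is a minimal Python sketch of threshold-based clickstream monitoring. It is an illustration only, not a production stream processor; the class name, window size and alert threshold are all assumptions made for this example:

```python
from collections import deque

class ClickstreamMonitor:
    """Toy sliding-window check on a clickstream: if one user accounts for
    more than `threshold` of recent events, flag abnormal behavior."""

    def __init__(self, window_size=100, threshold=0.5):
        self.window = deque(maxlen=window_size)  # most recent events only
        self.threshold = threshold

    def observe(self, user_id):
        self.window.append(user_id)
        if len(self.window) < self.window.maxlen:
            return None  # wait until the window fills before judging rates
        rate = self.window.count(user_id) / len(self.window)
        if rate > self.threshold:
            return f"ALERT: abnormal clickstream share from {user_id}"
        return None
```

In a real deployment the same logic would sit in a stream processor at the edge, emitting security alerts as events arrive rather than after batch collection.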

For added security, blockchain can be used to provide an immutable ledger that demonstrates the provenance of the data’s source (e.g. a “known” IoT device), along with confirmation of its current security software updates, admin control, etc.
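As a rough illustration of how such a ledger works, here is a minimal hash-chained log in Python. It is a sketch of the provenance idea only, not a real blockchain (no consensus, no distribution); the device and digest names are invented for the example:

```python
import hashlib
import json

def _hash(block):
    # Canonical serialization so the same block always hashes the same way
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class ProvenanceLedger:
    """Hash-chained log: each block commits to the previous one, so any
    tampering with an earlier record breaks verification downstream."""

    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64, "record": "genesis"}]

    def record(self, device_id, payload_digest):
        block = {
            "index": len(self.chain),
            "prev": _hash(self.chain[-1]),
            "record": {"device": device_id, "digest": payload_digest},
        }
        self.chain.append(block)
        return block

    def verify(self):
        return all(
            self.chain[i]["prev"] == _hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )
```

Each ingested payload is hashed and recorded against its source device, so a later consumer can check that the data really came from a known device and was not altered after ingest.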

2. Edge Analytics

Placing AI models at the metro-edge where data is being ingested enables real-time analytics and responses while driving toward localized model building. Regional/metro-edge sample data sets or their reports are sent to the core for global model building and updates.
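The pattern of deciding locally while sampling data back to the core might look like the following Python sketch. The threshold “model” and sampling scheme are stand-ins I've assumed for a real AI model and upload pipeline:

```python
import random

class EdgeNode:
    """Sketch of the edge-analytics pattern: score events locally with a
    lightweight model, act in near real time, and forward a small sample
    of raw events to the core for global model (re)training."""

    def __init__(self, core_buffer, sample_rate=0.01, threshold=80.0):
        self.core_buffer = core_buffer  # stands in for a link to the core site
        self.sample_rate = sample_rate  # fraction of events sent upstream
        self.threshold = threshold      # stand-in for a trained model

    def ingest(self, reading):
        # Local, low-latency decision made right where data is created
        action = "alert" if reading > self.threshold else "ok"
        # Sampled upload so the core can rebuild/update the global model
        if random.random() < self.sample_rate:
            self.core_buffer.append(reading)
        return action
```

When the core retrains on the sampled data, the updated model (here, a new threshold) would be pushed back out to the edge nodes.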

3. Data Tagging

Data is then tagged based on its contents and the policies governing its use, storage, etc. through data catalogs, also known as metadata as a service. These data catalogs are linked to one another globally and kept in constant sync.
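A toy catalog that tags data on ingest and answers tag queries could look like the following Python sketch; the method names and tag scheme are assumptions for illustration, not any particular catalog product’s API:

```python
class DataCatalog:
    """Minimal metadata catalog: register objects with tags on ingest,
    then look them up by tag without knowing where the data lives."""

    def __init__(self):
        self._entries = {}  # object_id -> {"location": ..., "tags": set}

    def register(self, object_id, location, tags):
        self._entries[object_id] = {"location": location, "tags": set(tags)}

    def find(self, *tags):
        # Return (object_id, location) for entries carrying ALL given tags
        wanted = set(tags)
        return [
            (oid, entry["location"])
            for oid, entry in self._entries.items()
            if wanted <= entry["tags"]
        ]
```

Keeping replicas of this mapping in sync across regions is what turns a local catalog into the globally linked catalogs described above.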

4. Data Fabric

A global data fabric abstracts the physical storage location away from the data itself. A single namespace allows data and systems to be addressed as if they were in one location, which simplifies a company’s data management workload and reduces complexity. Leveraging an interconnection service such as Equinix Cloud Exchange Fabric™ (ECX Fabric™) can give a data fabric greater performance, control and security. ECX Fabric enables organizations to easily deploy hybrid multicloud infrastructures and securely connect to network and cloud ecosystems on a global, interconnected fabric.
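The single-namespace idea can be illustrated with a small Python sketch that maps logical paths to physical backends; the paths and backend names are made up for the example and do not represent ECX Fabric’s actual interface:

```python
class DataFabric:
    """Single-namespace sketch: applications use logical paths, and the
    fabric resolves them to whichever physical backend holds the data."""

    def __init__(self):
        self._mounts = {}  # logical prefix -> physical backend prefix

    def mount(self, logical_prefix, physical_prefix):
        self._mounts[logical_prefix] = physical_prefix

    def resolve(self, logical_path):
        # Longest-prefix match so nested namespaces resolve correctly
        for prefix in sorted(self._mounts, key=len, reverse=True):
            if logical_path.startswith(prefix):
                return logical_path.replace(prefix, self._mounts[prefix], 1)
        raise KeyError(logical_path)
```

Because applications only ever see the logical path, the fabric can migrate data between backends without any application changes.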

5. Data Storage

This global data fabric will help move the ingested data to the different storage options available, both in terms of the type (object, file, block & database) and the location (cloud vs on-premises vs a specific geography).
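A placement decision of this kind can be sketched as a simple policy function; the tier names and rules here are illustrative assumptions, not a real product’s policy engine:

```python
def choose_store(data_type, region, residency_required=False):
    """Map a dataset's type and location constraints to a storage tier,
    covering both dimensions named in step 5: type and location."""
    if residency_required:
        # Regulated data stays on-premises in its required geography
        return f"on-prem-{region}"
    if data_type in ("object", "file"):
        return "cloud-object-store"
    if data_type in ("block", "database"):
        return "cloud-block-store"
    return "archive"
```

In a real system the data fabric would evaluate a rule like this for each dataset (or each tag from the catalog) before moving it.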

6. Data Lookup and AI/Data Exchanges

Corporate applications and data science toolkits leverage data catalogs much like a DNS lookup: the application requests data based on its metadata, and the catalog points it to the data’s correct location. Simultaneously, data is intelligently moved based on how it needs to be accessed, while maintaining compliance with the regulations coded into the data fabric.

AI/data exchanges or marketplaces enable data and algorithms intended for sharing to be sent through to the appropriate gateway. Blockchain can be used here as well to track all exchanges of value, counterparties and specifically what was exchanged for audit and scaling purposes. It can also pass on the provenance of where the data or AI algorithm came from using the original blockchain transactions.
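To make the audit idea concrete, here is a toy exchange in Python that logs each transfer with its counterparties and an optional provenance reference. In practice that trail could live on the blockchain ledger from step 1; here it is a plain in-memory list, and all names are invented for the example:

```python
class DataExchange:
    """Minimal AI/data exchange: records who shared which asset with whom,
    so every transfer is auditable after the fact."""

    def __init__(self):
        self.audit_log = []

    def share(self, provider, consumer, asset_id, provenance=None):
        entry = {
            "provider": provider,
            "consumer": consumer,
            "asset": asset_id,
            # e.g. the hash of the ingest block that proves the data's origin
            "provenance": provenance,
        }
        self.audit_log.append(entry)
        return entry
```

Carrying the provenance reference forward is what lets a consumer trace a purchased dataset or model all the way back to the original ingest transaction.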

Bringing it all together safely and at speed where it’s needed most

This integrated view of a neural data system offers a clear roadmap and a plug-and-play framework for the future of data. Even if an enterprise isn’t prepared for the entire system, it can now see how the components it needs fit into a larger whole. It can start with any one (or several) of the components with a view of how its architecture will evolve over time. Service providers offering multiple, single or even niche components can also better understand how their solutions augment the larger system. In the end, data that is properly leveraged both internally and externally will have tremendous value, moving businesses away from a siloed view toward greater collaboration and efficiency.

To be successful, a neural data system needs to provide its participants with safe and reliable access to its data at speed. Private interconnection can help ensure trusted data exchange between participants, while improving performance, scalability and resilience. Global interconnection solutions, such as those found on Platform Equinix®, provide private connectivity to a dense ecosystem of providers and enterprises for low-latency, secure exchange of data, insights and AI models.

Watch the webinar on Accelerating Digital Transformation with AI to learn more.

You may also be interested in reading our blogs on data exchanges and AI.


[i] IDC, Data Age 2025, sponsored by Seagate, Doc #US44413318, Nov 2018; MicroStrategy, How Much Data by 2025?, Jan 2020.