Developing a Successful End-to-End Complex Event Processing Strategy

Bob Goban


Complex event processing (CEP) is a method of synthesizing data from multiple sources so businesses can extract meaningful patterns or trends from it. CEP is getting closer to reaching its full potential, thanks to new automated business process technologies that rely on rule-based algorithms and to better integration with business program management. However, the growing collection of CEP output, along with logging functions at the networking, security, data and application layers, is dividing service management into silos. In this situation, how can you gain an integrated, end-to-end business view of what’s really happening?
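To make that definition concrete, here is a minimal sketch in plain Python (the event kinds and the “checkout at risk” rule are hypothetical, not drawn from any particular CEP product): individually, a latency spike or a handful of login failures would not page anyone, but a rule that correlates them within a time window surfaces a composite, business-relevant event.

```python
from dataclasses import dataclass

# A "simple" event as it might arrive from any one monitoring source.
@dataclass
class Event:
    source: str       # e.g., "application", "security", "network"
    kind: str         # e.g., "latency_spike", "login_failure" (hypothetical names)
    timestamp: float  # seconds since epoch

def detect_checkout_risk(events, window_seconds=300):
    """Flag a composite 'checkout at risk' event when an application latency
    spike and a burst of security login failures land in the same window."""
    spikes = [e for e in events if e.kind == "latency_spike"]
    failures = [e for e in events if e.kind == "login_failure"]
    for spike in spikes:
        nearby = [f for f in failures if abs(f.timestamp - spike.timestamp) <= window_seconds]
        if len(nearby) >= 3:
            return f"Composite event: checkout at risk around t={spike.timestamp}"
    return None
```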

The industry forces driving CEP

The global CEP market is expected to grow approximately four-fold from $1.41 billion in 2015 to $5.12 billion in 2020, according to Mordor Intelligence, with digital technologies (e.g., cloud, artificial intelligence/machine learning, automation and data analytics) contributing to its rise. Additional industry forces driving this accelerated growth include:

  • The increasing pace of technology and digital business change is diminishing the overall end-to-end understanding of the environment. The maturity and understanding of these changes are also not aligned throughout the organization.
  • As automation is applied to business processes, more guard rails (control policies) are needed to prevent a runaway catastrophe (e.g., an operational failure) from occurring.
  • Advanced persistent threat actors probe different locations throughout the network over long periods of time – a broad, corporate-wide view is needed to detect when several disparate anomalies represent a coordinated event or threat (a simple correlation sketch follows this list).
  • Technology teams and business units are having to work more closely together and understand more about each other’s knowledge domains.
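To illustrate that corporate-wide correlation, the hedged sketch below assumes each low-severity anomaly record carries a source IP, a location and a day index (hypothetical field names). No single anomaly warrants action, but an indicator seen at several distinct locations within one horizon is exactly the coordinated pattern a siloed view would miss.

```python
from collections import defaultdict

def flag_coordinated_probes(anomalies, min_locations=3, horizon_days=30):
    """Group low-severity anomalies by a shared indicator (a source IP here)
    and flag indicators seen at several distinct locations within the horizon."""
    by_indicator = defaultdict(list)
    for a in anomalies:  # each anomaly: {"src_ip": ..., "location": ..., "day": ...}
        by_indicator[a["src_ip"]].append(a)
    suspects = []
    for ip, hits in by_indicator.items():
        locations = {h["location"] for h in hits}
        days = [h["day"] for h in hits]
        if len(locations) >= min_locations and max(days) - min(days) <= horizon_days:
            suspects.append(ip)
    return suspects
```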

Constraints to deploying end-to-end CEP

A number of constraints keep businesses, especially those going through a company-wide digital transformation, from successfully deploying an integrated, end-to-end CEP solution. They include the following:

  1. Analytics has emerged as the only practical way to find the proverbial needles in the haystack. However, this very capability is also the biggest limitation: a microscopic viewpoint creates a siloed perspective, isolating critical intelligence.
  2. When we try to manage a business scenario that crosses multiple technology disciplines, we tend to find that each individual service appears to be working fine, yet the business overall experiences a failure.
  3. What we’re measuring and monitoring is often not what matters from a business perspective (e.g., an alert such as: “Jane D. is experiencing a 20% reduction in CoolApp performance; this will cost the firm thousands in lost business and put us on the regulatory radar”).
  4. When we take these constraints to a multicloud, multi-organization, distributed and dynamically changing environment – running new integrated business models – it makes sense to wonder how well this is understood.

In addition, according to the Mordor Intelligence report, “…situations are intensified by an ever-expanding service industry generating enormous data volumes from a large, varied collection of distinct sources.” It’s extremely difficult for human practitioners to manage this huge amount of data, and to process and present event outcomes, for any one organization, let alone an entire company. As a result, CEP technology must fulfill increasingly demanding market requirements for low-latency filtering, aggregating, correlating and computing over a broad collection of historical and real-time streaming data. And with all this event data processing happening at the digital edge – alongside intersection points, security checkpoints, data services, APIs and messaging – you need to be able to manage the big picture.
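As a rough, illustrative sketch of that low-latency filtering and aggregating (plain Python; a production deployment would rely on a streaming engine, but the windowing idea is the same), the class below maintains a rolling aggregate as events arrive rather than waiting for a later batch pass:

```python
from collections import deque

class SlidingWindowAggregator:
    """Keep a rolling count and average for one event stream so filtering and
    aggregation happen as events arrive, not in a later batch job."""
    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()  # (timestamp, value) pairs in arrival order

    def add(self, timestamp, value):
        self.events.append((timestamp, value))
        # Evict anything that has fallen out of the time window.
        while self.events and timestamp - self.events[0][0] > self.window:
            self.events.popleft()

    def average(self):
        if not self.events:
            return 0.0
        return sum(v for _, v in self.events) / len(self.events)
```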

A simple solution to complex event processing

When enterprise architects (or business technology officers) are provided with a CEP platform pre-loaded with insights gleaned from the event processing network, as well as from security, data and application services, they can define CEP models that offer transparent, business-level operational views into areas of concern. These models provide real-time dashboard views of business transactions, with secondary time-based trend reports. Business service levels can be tracked and controlled to avoid penalties from regulatory authorities, and optimization teams can identify and investigate the weakest links.
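A minimal sketch of what applying business context to a raw technical reading might look like follows; the service name, SLA threshold and cost figures are purely illustrative assumptions, not real benchmarks.

```python
# Hypothetical mapping from a technical service to the business context a
# dashboard would present alongside any alert on that service.
BUSINESS_CONTEXT = {
    "coolapp-checkout": {
        "business_unit": "E-commerce",
        "sla_p95_ms": 800,
        "cost_per_breach_hour": 12_000,  # illustrative figure only
        "regulated": True,
    },
}

def business_view(service, p95_latency_ms, breach_minutes):
    """Translate a raw performance reading into a business operational view:
    is the SLA breached, what is it costing, and is there regulatory exposure?"""
    ctx = BUSINESS_CONTEXT[service]
    breached = p95_latency_ms > ctx["sla_p95_ms"]
    return {
        "service": service,
        "business_unit": ctx["business_unit"],
        "sla_breached": breached,
        "estimated_cost": round(ctx["cost_per_breach_hour"] * breach_minutes / 60) if breached else 0,
        "regulatory_exposure": breached and ctx["regulated"],
    }
```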

To get started, first determine where your highest time or resource costs are, then benchmark alternatives in a business context to compare the net business improvement. New business models should be translated into similar analytical models and published and reported on in real time and in batch. A digital edge node (interconnection hub) will help aggregate events and provide both a localized and a shared global view.

The design pattern for complex event processing at the digital edge prescribes leveraging an Interconnection Oriented Architecture™ (IOA®) strategy for directly and securely interconnecting people, locations, clouds and data, and it allows you to deploy a flexible CEP implementation that is easy to manage and control (see diagram below). Begin by following the steps below:

  1. Make CEP analytical platform components a collection of cloud- and SaaS-based services. However, regulatory compliance or data sensitivity rules may require some local deployment in the digital edge node.
  2. Apply a data pipeline to aggregate events in a distributed storage repository (data lake) for batch/MapReduce processing (the batch layer). MapReduce is a programming model, available in open-source implementations, that filters incoming data streams into a defined data structure, which can then be used to populate a database. This processing is batched in most cases because the data sets are very large in volume and variety.
  3. Apply streaming data services to harvest and process events in real time (the speed layer), reacting to changes and triggers collected at the edge.
  4. Create merged views for analysis (the serving layer), combining batched data with real-time updates (a minimal sketch appears after the diagram below).
  5. Integrate the analytics from the network and security layer for the most comprehensive coverage.
  6. Generate service views and dashboards first, then define business views and operational dashboards, applying business context to the data views for operations, security, network, business units, etc.
  7. Maintain and fine-tune all the models, linking outcomes to what the dashboard shows.

Complex Event Processing Design Pattern
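Steps 2–4 describe merging a batch layer with a speed layer into a serving layer. The sketch below shows that merge in plain Python with invented counts; in a real deployment these views would be served by the platform’s own data services.

```python
def serving_view(batch_counts, speed_counts):
    """Merge the batch layer's precomputed counts with the speed layer's counts
    for events that arrived after the last batch run, giving one consistent
    view per business transaction type."""
    merged = dict(batch_counts)
    for key, count in speed_counts.items():
        merged[key] = merged.get(key, 0) + count
    return merged

# Example: the batch layer last ran an hour ago; the speed layer covers since then.
batch_counts = {"checkout": 10_240, "login": 55_310}
speed_counts = {"checkout": 312, "login": 1_044, "refund": 7}
print(serving_view(batch_counts, speed_counts))
# {'checkout': 10552, 'login': 56354, 'refund': 7}
```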

The benefits of complex event management at the digital edge

When coordinated with all business and IT organizations throughout the company, integrated, end-to-end complex event processing at the digital edge provides the following benefits:

  • Everyone becomes an expert on what’s going on, even if the technology being used is only hours old, with the ability to adapt to changes.
  • Delivering CEP using the steps above enables the platform to fully realize the vision of DevOps and to harmonize vertical and horizontal objectives (business and IT), reducing reliance on tribal knowledge and making data more available for data-driven decisions.
  • Whether you developed any of the code or crowdsourced it all, you still have an integrated team of business technology officers whose first priority is to rapidly integrate capabilities to capture new opportunities.
  • Every decision made is based on current and accurate decision-support analytics, and every innovation can be regression-tested to show how it improved business outcomes.

In the next blog article, we’ll discuss placing predictive algorithmic services at the digital edge.

In the meantime, visit the IOA Knowledge Base for vendor-neutral blueprints that take you step-by-step through the right patterns for your architecture, or if you’re ready to begin architecting for the digital edge now, contact an Equinix Global Solutions Architect.

You also may be interested in reading other blogs in the IOA Application Blueprint Design Pattern series:

How to Localize Digital Services at the Edge for Greater Performance and QoS

Accelerating Digital Business by Deploying Application API Management at the Edge

How to Plumb your Messaging Infrastructure for Application Flows at the Edge