With a proliferation of internet of things (IoT) use cases and 5G right around the corner, we’re hearing a lot more about the “edge” – as in edge computing, digital edge, micro edge, mobile edge and metro edge to name a few. It can be a bit mystifying, even for the geeks among us, so grab hold of the “edge of your seat” while we clear things up in this quick guide.
Gartner – Exploring the Edge: 12 Frontiers of Edge Computing
Infrastructure and Operations (I&O) leaders who manage cloud and edge infrastructure should develop a multiyear edge computing strategy that encompasses the diversity of use cases needed by their digital organizations.
Where is the edge?
Edge computing has been around since the late 1990s with the advent of content delivery networks that deployed servers closer to users to reduce latency, improve performance and lower network costs. The “edge” is typically defined as the outside limit or the place where a change of status or control is encountered. From an IT perspective, that means the edge of an organization’s IT architecture where data is exchanged with users, partners and other systems. But it’s a moving target. As technology advances, enabling new use cases and workflows that previously weren’t possible, the edge is also evolving. Here are a few things to know about the evolving edge:
Digital edge: This is where the physical meets the digital – the locations where users are, and where data and information are consumed, generated and transferred between users and applications.
Edge computing: Traditional IT infrastructures centralized compute and storage resources, which required data exchanged at the edge to be backhauled to the center/core for processing. Edge computing reverses this paradigm by distributing compute and storage closer to the edge, where data is generated and consumed. One useful definition comes from Dr. Karim Arabi, former Vice President of Engineering at Qualcomm: cloud computing operates on big data, while edge computing operates on “instant data” – real-time data generated by sensors or users.[i]
Edge of the network: The edge of the network depends on who the owner is – for a corporate network, this could be the edge of the local area network (LAN) or wide area network (WAN) managed by the IT department. For network service providers (NSPs), it is typically where operators connect to their customers and partners such as the central office for carriers. This could also be the point at which aerial networks (satellite, radio, wireless) connect to terrestrial or subsea networks/cables.
Wireless edge: The wireless edge brings real-time, high-bandwidth, low-latency access to latency-dependent applications, distributed at the edge of the radio access network (RAN) and enabled by multi-access edge computing (MEC). Formerly known as “mobile edge computing”, MEC was originally defined by the European Telecommunications Standards Institute (ETSI) as providing “IT and cloud-computing capabilities within the Radio Access Network (RAN) in close proximity to mobile subscribers.” Today it applies more broadly to the “edge of the network,” and ETSI currently defines it as “cloud-computing capabilities and an IT service environment at the edge of the network.”[ii]
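The edge computing pattern described above (acting on “instant data” near its source and backhauling only results to the core) can be illustrated with a minimal sketch. Everything here is hypothetical and for illustration only; `summarize_at_edge` is not a real API:

```python
import statistics

def summarize_at_edge(readings, window=10):
    """Aggregate raw sensor readings locally so that only a compact
    summary is backhauled to the core, instead of every sample."""
    summaries = []
    for i in range(0, len(readings), window):
        batch = readings[i:i + window]
        summaries.append({
            "count": len(batch),          # samples represented by this record
            "mean": statistics.fmean(batch),
            "max": max(batch),
        })
    return summaries

# 100 raw samples collapse into 10 summary records sent upstream,
# trading backhaul bandwidth for a little compute at the edge.
raw = [20.0 + (n % 7) * 0.5 for n in range(100)]
print(len(raw), "->", len(summarize_at_edge(raw)))
```

The same tradeoff appears throughout edge architectures: spending local compute and storage to avoid moving raw data across the network, which is exactly what centralized models could not do.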
What’s driving the edge?
The convergence of several key trends is accelerating edge computing. These include:
Distributed applications: Stateless applications are becoming more prevalent, driving the need for increasingly dynamic and distributed systems that require more local processing and storage, as well as improved network performance and capacity.
Emerging technologies and use cases: Advances in artificial intelligence (AI), machine learning (ML), the internet of things (IoT) and virtualization are making it easier to deploy and run a wider variety of use cases at the edge such as connected cars, drones, remote healthcare and edge analytics.
Data volume: Of course, more use cases = more data! By 2021, Cisco predicts that creation of useful data will outstrip global data center network transport by a factor of four to one (85 vs 20.6 zettabytes).[iii] A dramatic increase in data volume means that the edge will need to evolve to support proximate computing and service exchange for optimal user experience.
5G and evolving networks: 5G opens the door to significantly improved data transfer speeds, low-latency connectivity, capacity and reliability, but it also requires new network architectures that can handle more data exchange points. This is driving network service providers (NSPs) to deploy new physical aggregation points to address latency and data throughput requirements. Equinix recently launched a 5G and Edge Proof of Concept Center (POCC) in Dallas to enable testing and comparison of 5G architectures.
How does this impact data center design?
In design, it often boils down to a tradeoff such as performance vs cost. While traditional data centers focus on maximizing space and power to support large deployments that need to scale up or down quickly, edge computing demands a different approach. Emerging technologies such as artificial intelligence (AI) embody the intersection of latency-sensitivity, large data volumes, critical workflows and a level of distribution not previously delivered on a wide scale. Addressing requirements for applications such as these requires a greater level of analysis, operational understanding, and practical know-how when compared to existing models of support. Taking a deeper look at the infrastructure requirements in the sub-40-millisecond realm will provide us with some additional clarity:
Metro edge: Also known as the regional hybrid core, the metro edge is where digital ecosystem participants meet to exchange traffic in proximity to major population centers in hybrid multicloud architectures. These are usually traditional data centers.
Cloud edge: CSPs already operate at the regional hybrid core to enable the greatest flexibility in connection and high-performance access to their services. Now they are working to drive that edge even closer to their customers through on-site hybrid delivery and boosting their capabilities at the regional edge. They are doing this by leveraging highly interconnected sites to ensure quality of service, while driving compute power closer to end-point devices.
Micro/modular edge: Metro edge data centers were previously considered to be the edge, but that is changing. The micro data center (MDC) or modular edge data center (MEDC) of today is a very small, typically unstaffed facility that allows for highly localized placement of mission-critical infrastructure. The combination of a small, flexible footprint, interconnection and exchange, along with the ability to support applications with modern edge demands, makes the MEDC a challenging yet rewarding design that could help propel the most sophisticated IT architectures to come.
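The sub-40-millisecond figure above can be grounded with a back-of-the-envelope propagation calculation. The roughly 200 km per millisecond speed of light in optical fiber is a common rule of thumb (not a number from this article), and real networks add routing, queuing and processing delay on top, which only tightens the budget:

```python
# Rough propagation speed of light in fiber: ~200 km per millisecond
# (vacuum speed of light divided by glass's refractive index of ~1.5).
KM_PER_MS = 200.0

def max_one_way_distance_km(latency_budget_ms):
    """Farthest a server could sit from the user if the entire
    round-trip budget were spent on fiber propagation alone."""
    return (latency_budget_ms / 2) * KM_PER_MS

for budget_ms in (40, 10, 1):
    print(f"{budget_ms:>3} ms round trip -> server within "
          f"~{max_one_way_distance_km(budget_ms):,.0f} km")
```

Even a generous 40 ms round-trip budget caps propagation distance at a few thousand kilometers before any processing happens, and tighter budgets shrink that radius to a metro area or less, which is why metro and micro/modular edge placement matters for latency-sensitive workloads.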
To see how Equinix is approaching modular EDC design, check out “A Quick Guide to Choosing the Right Data Center Design.”
Read the Gartner report to find out more about the 12 Frontiers of Edge Computing.