Classifying The Modern Edge Computing Platforms


A decade ago, edge computing meant delivering static content through a distributed content delivery network (CDN). Akamai, Limelight Networks, Cloudflare and Fastly are examples of CDN providers. They deliver high availability and performance by distributing and caching content closer to the end user's location. 

The definition of the edge has changed significantly over the last five years. Today, the edge represents more than a CDN or a compute layer. It has evolved into an extension of the public cloud, running in extremely diverse environments and computing contexts. 


Before we classify the modern edge, let's look at how it evolved.

Four factors influenced the evolution of modern edge:

  1. Cloud - Cloud computing provided compute, storage and networking services to businesses. Object storage services such as Amazon S3, Azure Storage and Google Cloud Storage became the origin for content used by workloads hosted in the cloud. CDN became a logical extension of object storage, distributing and caching content across a network of edge locations. Amazon CloudFront, Azure CDN and Google Cloud CDN are examples of content delivery networks that extend object storage capabilities. 
  2. Industrial IoT - The rise of Industrial IoT (IIoT) led to the introduction of the IoT gateway - a specialized device that translates the protocols used by local devices into cloud protocols. Classic examples of such translation include converting OPC UA to MQTT and mapping Modbus and CAN bus to TCP or UDP. The IoT gateway also acts as a data aggregator, combining and multiplexing the telemetry streams from multiple devices and filtering them before streaming to the cloud. 
  3. Artificial Intelligence - More recently, AI became a key component of IIoT. By deploying deep learning models at the edge, organizations can perform inference in real time. Predictive maintenance - an approach to detecting failures in devices and machinery before the actual disruption - demands fast turnaround. Performing inference on telemetry streams in the cloud is not only slow but also expensive in terms of bandwidth cost. IIoT customers want to run AI models locally, close to the devices that generate the data.
  4. 5G Networks - 5G networks are game-changing for organizations in the manufacturing, healthcare, retail and automotive industries. Infrastructure running at a telecom provider's facility and connected via the 5G network offers low latency. Telecom providers are moving towards a multi-tenant, hosted infrastructure layer that bridges the gap between the cloud and end users. Public cloud providers such as Amazon, Google, IBM and Microsoft are partnering with telecom companies to bring some of their managed services to these 5G-based edge locations.
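The gateway pattern described above - aggregating telemetry from many devices, filtering it locally and streaming a compact payload to the cloud - can be sketched in a few lines. Everything here is illustrative: the device names, glitch threshold and payload format are invented for the example, and a real gateway would first translate OPC UA or Modbus frames into plain readings.

```python
import json
import statistics
from collections import defaultdict

class EdgeGateway:
    """Toy IoT gateway: buffers per-device telemetry, filters out
    obvious sensor glitches, and batches everything into a single
    cloud-bound JSON payload."""

    def __init__(self, temp_threshold=100.0):
        self.buffer = defaultdict(list)
        self.temp_threshold = temp_threshold  # readings above this are dropped as glitches

    def ingest(self, device_id, temperature):
        # Filter locally so bad readings never consume uplink bandwidth.
        if temperature <= self.temp_threshold:
            self.buffer[device_id].append(temperature)

    def flush(self):
        # Multiplex all device streams into one compact payload,
        # sending only summary statistics instead of raw samples.
        payload = {
            dev: {"mean": round(statistics.mean(vals), 2), "n": len(vals)}
            for dev, vals in self.buffer.items() if vals
        }
        self.buffer.clear()
        return json.dumps(payload)

gw = EdgeGateway()
for t in (21.0, 22.0, 250.0):   # 250.0 is a glitch and gets filtered out
    gw.ingest("pump-1", t)
gw.ingest("pump-2", 30.0)
print(gw.flush())
```

Shipping per-device summaries rather than raw streams is what keeps bandwidth costs down, and it is the same motivation that pushes AI inference out of the cloud and onto the edge.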

The above trends have changed the definition of edge computing by broadening the scope and expanding the boundaries of the edge. 

An edge computing layer runs anywhere between the devices layer and the cloud layer. 

Here are six forms of edge computing that cover the whole spectrum spanning the devices to the cloud:

Micro Edge

The micro edge is the most recent incarnation of the edge computing layer. When a microcontroller is capable of running a TinyML AI model, it qualifies as a micro-edge computing device. In this use case, the sensors connected to the microcontroller generate a telemetry stream that a deep learning model uses for inference. Unlike other scenarios, where a microcontroller collects the telemetry and ingests it into a separate edge computing layer, this type of edge runs entirely within the context of the microcontroller or microprocessor itself. 

Examples of the hardware and software stack include TensorFlow Lite for Microcontrollers running on ARM Cortex-M-based microcontrollers such as the Nordic nRF52840 or the Ambiq Apollo3 Blue.
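Models like these typically run quantized to 8-bit integers so the weights and activations fit in a microcontroller's limited memory. The sketch below shows the affine (scale and zero-point) quantization scheme that TensorFlow Lite uses; the scale and zero-point values are made up for illustration, and a real model stores them per tensor.

```python
def quantize(x, scale, zero_point):
    """Map a float to int8 using affine quantization: q = round(x/scale) + zp."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))   # clamp to the int8 range

def dequantize(q, scale, zero_point):
    """Recover the approximate float value: x = (q - zp) * scale."""
    return (q - zero_point) * scale

# Illustrative parameters, not taken from any real model.
scale, zero_point = 0.05, -10

q = quantize(1.0, scale, zero_point)   # -> 10
x = dequantize(q, scale, zero_point)   # -> 1.0
print(q, x)
```

The microcontroller then does its arithmetic on the int8 values, dequantizing only where needed, which is why a Cortex-M class chip with a few hundred kilobytes of RAM can run inference at all.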

Mini Edge

The mini edge is based on a single-board computer built on either the ARM64 or AMD64 architecture. It is typically powered by an AI accelerator to speed up inference and is capable of running a full-blown operating system such as Linux or Microsoft Windows. The mini edge comes with a software stack associated with the AI accelerator. These types of edge devices are ideal for protocol translation, data aggregation and AI inference. 

Examples of the mini edge include an NVIDIA Jetson Nano module running NVIDIA JetPack and TensorRT, or an x86 board with an Intel Movidius Myriad X Vision Processing Unit (VPU) running Intel's Distribution of OpenVINO Toolkit. The mini edge runs in disconnected, mobile and remote environments such as trucks, vessels, aircraft and windmills.

Medium Edge

The medium edge deployment model represents a cluster of inexpensive machines running at the edge computing layer. The compute cluster is powered by a Graphics Processing Unit (GPU), a Field-Programmable Gate Array (FPGA), a Vision Processing Unit (VPU) or an Application-Specific Integrated Circuit (ASIC). A cluster manager such as Kubernetes orchestrates the workloads and resources in the cluster. 

Medium edge examples include a Kubernetes cluster of Intel NUC machines or Zotac mini PCs connected to NVIDIA GPUs or Intel Movidius VPUs, deployed in a retail store or a restaurant. They may run NVIDIA CUDA/TensorRT or Intel oneDNN/OpenVINO models for AI acceleration.
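The orchestration job the cluster manager performs here largely boils down to accelerator-aware placement: finding a node with enough free GPU (or VPU) capacity for each workload, the way Kubernetes tracks extended resources such as GPUs. A greatly simplified greedy sketch, with invented node and workload names:

```python
def schedule(pods, nodes):
    """Greedy accelerator-aware placement: assign each pod to the
    first node with enough free GPUs, or None if no node fits.
    This is a toy model, not the real Kubernetes scheduler."""
    placement = {}
    for pod, gpus_needed in pods:
        for node in nodes:
            if node["free_gpus"] >= gpus_needed:
                node["free_gpus"] -= gpus_needed   # reserve capacity
                placement[pod] = node["name"]
                break
        else:
            placement[pod] = None   # unschedulable: no node has capacity
    return placement

nodes = [{"name": "nuc-1", "free_gpus": 1}, {"name": "nuc-2", "free_gpus": 2}]
pods = [("vision-inference", 1), ("detector", 2), ("tracker", 1)]
print(schedule(pods, nodes))
# -> {'vision-inference': 'nuc-1', 'detector': 'nuc-2', 'tracker': None}
```

The real scheduler adds filters, scoring and preemption on top, but the core constraint it enforces - accelerators are scarce and must be matched to workloads - is the same one a medium edge cluster lives with.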

Heavy Edge

The heavy edge computing device is typically a hyperconverged infrastructure (HCI) appliance that runs within an enterprise data center. It comes with an all-in-one hardware and software stack typically managed by a vendor. Heavy edge demands power and network resources that are available only in an environment like an enterprise data center.

The heavy edge doubles as an IoT gateway, a storage gateway, and an AI training and inference platform. It comes with an array of GPUs or FPGAs designed for managing end-to-end machine learning pipelines, including the training and deployment of models. 

AWS Snowball Edge, Azure Stack Edge, NVIDIA EGX A100 and Nutanix Acropolis are examples of the heavy edge. 

Multi-Access Edge

Multi-Access Edge Computing (MEC) moves the processing of traffic and services from a centralized cloud to the edge of the network, closer to the customer. With 5G becoming a reality, MEC is becoming the intermediary layer between the consumers and providers of the public cloud. 


With MEC, the edge infrastructure runs within a telecom provider's facility, co-located in a data center or even at a cellular tower. It is delivered as a managed service either by the telecom company or by a public cloud provider. 


AWS Wavelength, Azure Edge Zones and Google’s Global Mobile Edge Cloud powered by Anthos are examples of MEC. 

Cloud Edge

Cloud Edge does for dynamic workloads what the CDN did for static content. It distributes components of an application across multiple endpoints to reduce round-trip latency. Cloud Edge relies on modern application development paradigms such as containers and microservices to distribute the workload. Static content and stateless components of an application are replicated and cached across the global network. 
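At its core, a cloud edge platform routes each request to the replica that answers fastest. A minimal sketch of that selection logic follows; the latency probe is faked and the endpoint names are invented so the example runs offline, whereas a real platform would measure actual round-trip times to live endpoints.

```python
def probe_latency(endpoint):
    """Stand-in for a real RTT probe (e.g. timing a HEAD request).
    These endpoint names and latencies are invented for illustration."""
    fake_rtt_ms = {
        "edge-us-east": 12.0,
        "edge-eu-west": 85.0,
        "edge-ap-south": 190.0,
    }
    return fake_rtt_ms[endpoint]

def nearest_edge(endpoints):
    # Route the request to whichever replica has the lowest latency -
    # the same idea a cloud edge platform applies to containerized services.
    return min(endpoints, key=probe_latency)

print(nearest_edge(["edge-us-east", "edge-eu-west", "edge-ap-south"]))
# -> edge-us-east
```

Production systems refine this with anycast routing, health checks and load shedding, but latency-based replica selection is the piece that turns a set of distributed containers into a low-latency edge.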

The cloud edge providers may support AI acceleration as an optional feature. Since it is delivered as a managed service, customers won’t have to deal with the hardware and software maintenance. 

Section.io, Volterra.io, Mimik and Swim.ai are some of the providers delivering cloud edge capabilities. With increased investments in hybrid and multi-cloud environments based on Kubernetes, mainstream cloud providers may eventually offer cloud edge services as an extension of their existing platforms. 

The definition of edge computing and the ecosystem are rapidly evolving to meet the demands of enterprise customers.
