Data Storage 101: Edge Cloud Explained – Top Benefits and Considerations

Have you ever wondered how data gets onto your phone so quickly? Or, for that matter, how Internet of Things (IoT) devices, video streaming to your mobile device, and smart city technologies interact at lightning speed with far-off information systems? The answer is “the edge,” that zone where network access is closest to an end-user or connected device—the network’s edge, so to speak. 

Changes in consumers’ use of devices, along with the advent of 5G and the IoT, among other things, have driven a huge increase in the size and sophistication of edge infrastructure. Today’s edge comprises thousands of small data centers and antennas sitting atop bare-metal computing infrastructure. The concept of the edge is evolving rapidly, too, with seemingly endless growth in its reach and complexity. 

The edge cloud represents one of the newer manifestations of the edge. Edge cloud deploys a cloud architecture into edge infrastructure. Yet, it’s more than just a small-scale version of a hyperscale cloud facility parked in a remote place. This blog explores some of the things that make the edge cloud distinctive, as well as six considerations to keep in mind when selecting an edge cloud solution. 


What is Edge Cloud?

To get at a meaningful definition of edge cloud, it’s helpful to recall that the cloud is, at its core, a software architecture. Industry buzz and the predominance of massive cloud computing businesses tend to distort our understanding of the cloud’s true essence. The cloud is an architecture and approach to computing first. It’s a business and organizational culture second. It is designed to allow management and efficient use of applications and their required compute, storage, and network resources.

An edge cloud, therefore, is an implementation of cloud architecture at edge infrastructure sites. For example, cables coming off a 5G antenna may connect to a small, nearby building in the middle of a city. Inside the building are a few server racks, making it a so-called “micro data center.” You could run an edge cloud on those servers. That would consist of complete cloud architecture, including cloud computing, storage, and network, managed by a cloud operating system. It’s a cloud. It’s at the edge. It’s an edge cloud. It may be connected with other edge cloud instances, but it doesn’t have to be.


What is the difference between Edge Cloud and Edge Computing?

Edge cloud is different from edge computing. They are related concepts, but edge computing is broader in scope. Indeed, edge computing could be almost any sort of digital activity occurring at the edge. It might be a Content Delivery Network (CDN), an IoT data collector, edge caching of app data, software supporting a telecom company’s 5G services, and so forth. In contrast, edge cloud refers to a cloud architecture running at the edge. Of course, the two overlap. Edge cloud is a subset of edge computing. And an edge cloud could theoretically run almost any kind of edge compute process imaginable. 


Are edge clouds better than central clouds or central data centers?

One common question about edge cloud relates to why it is even needed in the first place. If you can connect a device to Amazon Web Services (AWS), for example, why would you need a minuscule data center a few blocks away, a microcosm of AWS running on a few racks? Alternatively, you might have an impressive corporate data center that will easily connect with remote devices. Why bother with an edge cloud instance? 

A number of issues explain why it is often better to use an edge cloud instead of a central cloud or data center. For one thing, any application that needs low latency requires a fast response from whatever infrastructure it calls on. Successful artificial intelligence (AI) outcomes, for example, rely on low latency and may suffer when every request must travel to a central data center. The more data that can be analyzed in a short amount of time, the better the AI outcomes. Take the example of a smart home. By using an edge computing platform and bringing the processing and storage of data closer to the smart home, backhaul and roundtrip time are reduced, and information can be processed at the edge. As a result, voice-based assistant devices, such as Amazon’s Alexa, which are common in smart homes today, can respond much faster.
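The roundtrip argument can be sketched with simple arithmetic. The figures below are illustrative assumptions, not measurements: the point is only that when processing time is fixed, total response time is dominated by the out-and-back propagation delay.

```python
# Illustrative roundtrip-time comparison: a request processed at a
# nearby edge cloud vs. a distant central cloud region. All latency
# figures are assumed values for the sake of the sketch.

def roundtrip_ms(one_way_ms: float, processing_ms: float) -> float:
    """Total response time: out-and-back propagation plus processing."""
    return 2 * one_way_ms + processing_ms

central = roundtrip_ms(one_way_ms=45.0, processing_ms=20.0)  # far-off region
edge = roundtrip_ms(one_way_ms=2.0, processing_ms=20.0)      # micro data center nearby

print(f"central cloud: {central:.0f} ms, edge cloud: {edge:.0f} ms")
# → central cloud: 110 ms, edge cloud: 24 ms
```

Under these assumed numbers, moving the same processing to the edge cuts the response time by roughly a factor of four, which is the effect a voice assistant or AI workload would feel.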

Using an edge cloud also keeps the bandwidth required for cloud traffic low. The edge cloud reduces traffic on the network, and the bottlenecks that result, by keeping data close to the edge devices that need it. The same is true in the opposite direction: cloud ingress bandwidth for data generated at the edge is similarly low. With IoT sensors, for example, a great deal of data originates at the edge, and there may be no good reason to move it across the network to a central cloud or data center. The edge cloud can also reduce the volume of network traffic by pre-processing edge data before shipping it elsewhere. 
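The pre-processing idea can be illustrated with a minimal sketch: rather than shipping every raw IoT reading upstream, the edge cloud collapses a window of readings into one summary per sensor. The sensor names and summary fields here are assumptions chosen for illustration.

```python
# Hedged sketch of edge-side pre-processing: aggregate raw
# (sensor_id, value) readings into per-sensor summaries, so far
# fewer records cross the network to the central cloud.

from collections import defaultdict
from statistics import mean

def summarize(readings):
    """Collapse raw (sensor_id, value) readings into one summary per sensor."""
    by_sensor = defaultdict(list)
    for sensor_id, value in readings:
        by_sensor[sensor_id].append(value)
    return {
        sid: {"count": len(vals), "min": min(vals), "max": max(vals), "mean": mean(vals)}
        for sid, vals in by_sensor.items()
    }

raw = [("temp-1", 21.0), ("temp-1", 21.4), ("temp-2", 19.8), ("temp-1", 21.2)]
summary = summarize(raw)
print(summary)  # 4 raw readings shrink to 2 summary records sent upstream
```

In a real deployment the aggregation window, the summary statistics, and what counts as "worth forwarding raw" would all depend on the application, but the bandwidth saving scales with the ratio of raw readings to summaries.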

Connectivity may be limited at the edge, too, which favors edge cloud over a central cloud or data center. This was once a problem limited to rural areas. However, as 5G and related device congestion start to overload network capacity, it’s an issue in the city as well. 

Top 6 considerations when choosing the right Edge Cloud

Implementing an edge cloud comes with a number of challenges, most of which can be addressed. Here are the top six considerations to keep in mind when choosing the right edge cloud solution:

  • Total Cost of Ownership (TCO)—Cost always matters in IT. With edge cloud, drivers of cost include a choice of hardware, storage, support, and more. It may make the most sense, from a TCO perspective, to use standard hardware for compute, networking, and storage. It might also pay to disaggregate storage to allow for independent scaling of storage and compute. 
  • Heterogeneous applications—It is effectively impossible to predict what kinds of applications will need to run on an edge cloud in the future. The best practice is to design for the lowest common denominator of use cases. That way, your edge cloud will have the maximum potential to run heterogeneous applications.
  • Limited serviceability—Edge locations can be difficult to service. This is often a matter of distance. Having sites spread across a wide area makes it costly to roll trucks out for maintenance. It can also be a matter of the sheer number of sites to service. Some telecommunications providers, for example, are contemplating having to support tens of thousands of small data centers across the country. A wise choice is a software-defined implementation of high availability, which can also extend the endurance of the hardware and storage media. 
  • Density and floor space requirements—High density will contribute to success in an edge cloud instance. Some of the physical spaces being built out for edge cloud are quite small, so it’s necessary to achieve high density, coupled with flexibility in form factors. This might mean deploying high-capacity flash drives, for example, for edge cloud storage.  
  • Agility—Agility in an edge cloud is partly about preparing to handle unknown future workloads and applications. Being agile in the edge cloud also involves being ready to move quickly and economically in reconfiguring the cloud’s component systems as requirements change. A further dimension of edge cloud agility relates to having cloud management tools that can easily and quickly adapt to evolving market needs.
  • Security and privacy—Edge cloud instances present an attractive attack surface for malicious actors. Security countermeasures must therefore be robust in an edge cloud. Given that edge cloud instances are not located inside a secure facility, this will mean hardening them against a wide array of threats. These range from physical intrusions to denial of service (DoS) attacks and any number of penetration techniques. At the same time, security management and operations have to be efficient. Automated remote patching, for example, is a non-negotiable security requirement for an edge cloud. If security managers have to issue patches manually to tens of thousands of edge cloud sites, that will negatively affect the cloud’s security posture. 
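The automated-patching requirement above can be sketched as a rollout loop. Everything here is hypothetical: the site names, the `apply_patch` and `report_version` callables, and the in-memory fleet state stand in for whatever management API a real edge cloud platform exposes.

```python
# Minimal, hypothetical sketch of automated remote patching across
# many edge sites: push a patch to each site, verify the version the
# site reports back, and flag any site that failed for follow-up.

def rollout(sites, patch_version, apply_patch, report_version):
    """Apply a patch to every site; return (patched, failed) site lists."""
    patched, failed = [], []
    for site in sites:
        apply_patch(site, patch_version)
        if report_version(site) == patch_version:
            patched.append(site)
        else:
            failed.append(site)  # needs manual or retried remediation
    return patched, failed

# In-memory stand-in for the fleet's state, purely for illustration.
state = {"site-001": "1.0", "site-002": "1.0"}
patched, failed = rollout(
    sites=list(state),
    patch_version="1.1",
    apply_patch=lambda s, v: state.__setitem__(s, v),
    report_version=lambda s: state[s],
)
print(patched, failed)  # → ['site-001', 'site-002'] []
```

At tens of thousands of sites, a loop like this would run in batches with health checks and automatic rollback, but the core contract is the same: no human issues patches site by site.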


The edge is growing across multiple dimensions. As edge compute workloads increase in scale, they are also expanding in terms of sophistication and complexity. Expectations of performance and latency are also trending upward. Users demand extreme performance, while system owners want edge computing that is flexible enough to accommodate shifting consumer tastes and business use cases. The edge cloud offers a compelling solution. It combines the agility of the cloud with the low latency of the edge. Making an edge cloud into a success, however, takes some careful planning and thinking through short- and long-term goals. By focusing on TCO, agility, density, security, and serviceability, it is possible to create and deploy an effective edge cloud solution.

Additional Resources

IDC Innovators in NVMe®/TCP
Cloud-Native Storage for Kubernetes
Disaggregated Storage
Ceph Storage
Persistent Storage
Kubernetes Storage
Edge Cloud Storage
NVMe® over TCP

About the Writer: