Amir Michael | Chief Technology Evangelist
“The edge” is quickly becoming a popular industry buzzword right alongside “the cloud,” and it’s certainly captured our imaginations, but there doesn’t seem to be much consensus on what “the edge” really means. With no standards body to define what the term encompasses, the reality is that the edge means different things to different people – it all depends on your perspective.
If you’re a hyperscaler, your view of the edge looks a lot different than if you’re a public or private Content Delivery Network (CDN) provider, just as it looks different to enterprises, government entities and so on. What’s generally agreed upon is that the edge takes shape as compute infrastructure is distributed ever farther from our central datacenter hubs – and the farther out you venture in the network, the less bandwidth capacity, space, power and staffing you have to work with.
Why do we need the edge to begin with? Even though we have the fiber capacity today to send data around the world, there are scenarios where it’s not practical to use expensive, bandwidth-limited transcontinental fiber. It is often more efficient to offload traffic to distributed regional networks and bring data and content closer to the end users who need it. In this way, the edge ultimately helps reduce network latency and alleviate bandwidth constraints – which is crucial for a host of reasons.
TODAY’S EDGE VS TOMORROW’S EDGE
Hyperscalers and CDNs are certainly interested in reducing latency and transit costs, even if they prioritize the benefits somewhat differently. Facebook, for example, needs to ensure that users’ Facebook pages load as quickly as possible to maximize user engagement and monetization opportunities. CDNs of all varieties, on the other hand, are more keenly interested in reducing the overall network load and the transit costs attendant to the massive volume of content and compute services traversing their pipes.
The edge plays a big role for enterprise and government applications today too, even though we don’t typically associate them with the edge. And here again the priorities are different: enterprises are servicing satellite offices and revenue-critical retail locations, while governments manage innumerable applications around the globe, spanning everything from municipal and smart city functions to supercomputing and research labs to military/intelligence surveillance.
All of these applications reside at the edge today in some form, yet many conversations about the edge are framed in the future tense. Those expectations center on what we call “the next-gen edge” – it’s what people envision today when they talk about the edge as The Next Big Thing. Just as smartphones came along and revolutionized the way we consume the internet, the expectation holds that the next-gen edge will be similarly transformative.
And it will! When it arrives. But the iPhone of the edge – the key that unlocks the door – hasn’t materialized yet. We can look forward to a 5G-enabled future where self-driving vehicles interface with roadway infrastructure and AR/VR headsets immerse us in 3D games and movies, while our smart factories and transport hubs basically automate themselves with the help of sensors and UAVs…but that vision of the next-gen edge is still a long way off, and the market hype has run well ahead of the killer app.
AN INDUSTRY BAROMETER
Amazon’s AWS Wavelength initiative will likely provide a good leading indicator of the next-gen edge’s arrival. It was designed specifically to service edge computing applications from within communications service providers’ datacenters at the edge of the network, enabling customers to reduce latency and traffic by serving and processing data much closer to end users – avoiding those long hops from edge node to central datacenter and back again.
AWS Wavelength’s footprint is relatively light today, and Amazon’s traditional globally distributed datacenter operation remains considerably larger. With 25 AWS Regions to choose from and only 13 AWS Wavelength locations, customers will likely continue to gravitate toward the Regions – and in many cases, a Region is still closer to your customers than the nearest Wavelength location. But rest assured: when customer adoption increases and Amazon hits the gas on its Wavelength investment, there’s a good chance the next-gen edge as it’s envisioned today won’t be far behind.
The real state of the edge is that it’s already here in many ways we overlook, and yet – as it awaits that killer app – still in its infancy when measured against the breathless expectations surrounding it. Whatever the edge looks like to you, at the storage layer Lightbits’ distributed architecture ensures unrivaled storage utilization: your resources consume only the storage they need, drawing from an ultra-elastic pool that can be quickly allocated, balanced and shared. We provide the multitenancy and resiliency you need, with the confidence that your services won’t be interrupted while a trained technician travels to a remote POP to swap a drive.
For the next-gen edge, you’ll also need maximum storage performance to keep pace with some pretty intensive IO requirements. Lightbits leverages NVMe/TCP to service your data wherever it resides – at the edge and everywhere else – at speeds comparable to locally attached flash.
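To make that “locally attached” feel concrete, here’s a minimal sketch of how a Linux host might attach a remote NVMe/TCP namespace using the standard nvme-cli tool, driven from Python for readability. The address, port and NQN below are placeholders rather than a real LightOS deployment, and the actual Lightbits provisioning workflow may differ – the point is simply that the remote volume surfaces on the host as an ordinary /dev/nvmeXnY block device.

```python
# Minimal sketch: attach a remote NVMe/TCP namespace so it appears to the host
# as a local block device. Assumes Linux with the nvme-tcp kernel module and
# the nvme-cli package installed. Address, port and NQN are placeholders.
import subprocess

TARGET_ADDR = "192.0.2.10"   # hypothetical storage server IP
TARGET_PORT = "4420"         # conventional NVMe over Fabrics port (placeholder)
TARGET_NQN = "nqn.2016-01.com.example:edge-volume-01"  # hypothetical subsystem NQN

def run(cmd):
    """Echo a command, run it, and raise if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Load the NVMe/TCP host driver (a no-op if it is already loaded).
run(["modprobe", "nvme-tcp"])

# Ask the discovery controller which subsystems it exports.
run(["nvme", "discover", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT])

# Connect to one subsystem; its namespaces appear as /dev/nvmeXnY devices.
run(["nvme", "connect", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT, "-n", TARGET_NQN])

# Confirm the new namespace is visible alongside any local NVMe drives.
run(["nvme", "list"])
```

From the application’s point of view, the attached namespace behaves like any locally installed NVMe drive, which is what lets the same software stack run unchanged at a remote POP or in a core datacenter.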
OPTIMIZING STORAGE COSTS AT THE EDGE
NVMe/TCP makes NVMe flash shareable over standard TCP/IP networks, dramatically reducing the cost and complexity of sharing NVMe while still delivering a high-performance storage solution. Micron and Lightbits believe NVMe SSDs are the future of edge computing: because IoT devices collect and process huge amounts of data at the edge, edge applications will require faster and higher-capacity flash memory solutions.
NVMe SSDs lower TCO by shrinking the physical storage footprint and reducing power and cooling costs, all while improving speed, flexibility, and reliability versus hard drives. Micron NVMe SSDs are well suited for edge applications: they can deliver up to 6X greater performance at a lower cost and with greater reliability.
Lightbits LightOS NVMe/TCP software-defined storage, running on standard x86 servers populated with Micron NVMe SSDs, lowers storage TCO while delivering the scalability and speed that keep data flowing fast from the edge to the core to the cloud. To validate the benefits of the combined solution, Micron tested LightOS on Micron NVMe SSDs in its labs; to see the results, visit the website and download the whitepaper.
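If you’d like to sanity-check an NVMe/TCP-attached volume yourself, a generic fio run like the sketch below is one straightforward way to do it. To be clear, this is not the methodology behind the Micron lab results referenced above – just an illustrative 4K random-read check, again driven from Python, with a placeholder device path.

```python
# Minimal sketch: a generic 4K random-read check against an NVMe/TCP-attached
# namespace using fio. Not the Micron/Lightbits whitepaper methodology – just a
# quick way to confirm the remote volume delivers flash-class IOPS and latency.
import subprocess

DEVICE = "/dev/nvme1n1"   # hypothetical NVMe/TCP-attached namespace (read-only workload below)

subprocess.run([
    "fio",
    "--name=edge-randread",
    f"--filename={DEVICE}",
    "--ioengine=libaio",
    "--direct=1",          # bypass the page cache so the device itself is measured
    "--rw=randread",
    "--bs=4k",
    "--iodepth=32",
    "--numjobs=4",
    "--runtime=60",
    "--time_based",
    "--group_reporting",
], check=True)
```

Comparing the reported IOPS and completion latencies against the same run on a locally installed NVMe SSD is a simple way to see how the TCP transport compares with direct-attached flash.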
To learn more about storage solutions for the edge computing applications of today and tomorrow, read these other blogs:
- How Can NVMe Over TCP Improve Edge Storage?
- The Future For Storage In The Edge Cloud
- Micron and Lightbits Labs: Collaboration Driving Data Center Transformation
###