Best Practices to Deploy Sustainable and Resilient Data Centers at Scale at the Network Edge
The term “network edge” has been used for decades in the networking community; it refers to the interface point between computer networks and the internet and serves as an important security boundary. This paper focuses on the “network edge” in the sense of “a location where a local edge data center interfaces with the Internet/cloud to support data-intensive and ultra-low latency applications.”1 For simplicity, we call these data centers “distributed network edge data centers” throughout the rest of the paper.
As illustrated in Figure 1, in order to support data-intensive and ultra-low latency applications such as high-definition streaming media, augmented reality (AR) / virtual reality (VR), autonomous vehicles, autonomous mining, and Industry 4.0, we must place compute and storage resources at the network edge. Bringing these resources closer to the data sources or the consumers of the data eliminates the latency incurred when traffic must traverse the path from central core cloud data centers or regional edge data centers to edge devices such as IoT devices.2