Currently the cloud has thousands of datacenters scattered around the world. For example, a user in Chennai might be streaming their Hotstar video feed from a datacenter in Delhi or even Singapore.
India has among the world’s highest data usage per smartphone, fueled by rich video content. According to Analysys Mason, Indian smartphone users now download an average of about 8.5 gigabytes of data a month – potentially more than 40 hours of video – over mobile networks, without using Wi-Fi.
Network builds are therefore all about delivering content with an excellent customer experience. Network architects are mindful of where content is generated, where it is cached, where it is consumed, and what the end-user experience is during consumption.
This thinking is changing the way content is accessed. Driven by the need to bring data and applications closer to the user, hundreds of thousands of scaled-down data centers will come up at the edge of the network to form the edge-cloud. Operators are distributing content toward the edge, creating a distributed cloud architecture. This shift to the edge is necessitated by several factors, most importantly latency and cost.
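The latency argument can be made concrete with a back-of-the-envelope calculation. The sketch below assumes light travels at roughly 200,000 km/s in optical fiber (about two-thirds of its speed in vacuum) and uses illustrative straight-line distances; real network paths add routing hops, queuing, and protocol overhead on top of this physical floor.

```python
# Minimum round-trip propagation delay over fiber, ignoring routing,
# queuing, and protocol overhead (which only add to these figures).
FIBER_SPEED_KM_PER_S = 200_000  # assumed ~2/3 of c in vacuum

def rtt_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Illustrative straight-line distances (assumptions, not measured paths)
for label, km in [("Chennai -> Singapore", 2900),
                  ("Chennai -> Delhi", 1750),
                  ("Chennai -> local edge site", 50)]:
    print(f"{label}: >= {rtt_ms(km):.1f} ms")
```

Even this idealized floor shows why placement matters: serving from a nearby edge site cuts tens of milliseconds of unavoidable propagation delay, before any congestion or processing is accounted for.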
Edge-cloud will utilize caching that will respond to, or – better yet – anticipate, local streaming demands. In addition, localized edge compute will be able to host persistent or on-demand applications with strict latency constraints.
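To illustrate the caching idea, here is a minimal sketch of an edge cache with least-recently-used (LRU) eviction. The `fetch_from_origin` callback is a hypothetical stand-in for a pull from the core cloud on a cache miss; a real edge cache would add TTLs, size-aware eviction, and the demand anticipation mentioned above.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache for edge content (a sketch, not a production design)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store: "OrderedDict[str, bytes]" = OrderedDict()

    def get(self, key: str, fetch_from_origin) -> bytes:
        if key in self._store:
            self._store.move_to_end(key)      # mark as most recently used
            return self._store[key]
        value = fetch_from_origin(key)        # miss: pull from the core cloud
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict least recently used
        return value
```

Every hit served from `_store` is a request that never traverses the backhaul to a distant datacenter, which is where the latency and cost savings of the edge come from.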
These new edge-cloud nodes will in many ways resemble large data centers. However, they may operate at a central office, a cell-tower aggregation site, or a headend. With that kind of affordable, flexible architecture in place, the edge-cloud will be able to spin up content and applications as needed.
Most content is unidirectional, such as a video feed being watched. But some content is bidirectional and response-based, such as gaming.
The type of content and the required user experience determine what edge compute should look like, how far from the users the edge should be created, and which performance functions are necessary.
In edge-cloud, location matters. Placing edge-cloud data centers near clusters of end users or devices that will benefit from higher performance and lower cost is important. Such locations might include tele-dense areas of mobile users that are streaming video, along highways for public safety applications, near factories or warehouses using manufacturing or logistics automation, or close to healthcare facilities.
The edge-cloud will be a unique ecosystem of open and interconnected data centers, built on partnerships between data center operators and service providers. In fact, that ecosystem is what will allow the edge-cloud to achieve critical mass.
The value that service providers bring is real estate and end customers. They own thousands of edge sites; they own the connecting network, and they have the infrastructure and capability to run these sites.
This capability is highly regionalized and extremely difficult for a data center operator to replicate. The value of data center operators, on the other hand, lies in their cloud experience and in the software capabilities to disaggregate compute between the central data center and the edge of the network.
The edge will be closer to the user – maybe just across town or down the street – and it plays a key role in improving customer experience. We need to build an ecosystem in which data center operators and service providers collaborate to deliver an exceptional user experience.