Data is booming. It’s not news to anyone, but the amount of data required by everyone connecting to the Internet is increasing rapidly. This trend is affecting the way data centers are constructed in some unexpected ways, particularly regarding how large data centers will become. Cloud and hybrid computing increasingly rely on machine learning and AI, which require servers to handle tremendous amounts of data, so computing needs continue to grow. Until now, facilities have been getting bigger to meet those needs. But bigger no longer means taller—it often means wider.
The generally accepted strategy of stationing monolithic data centers in the largest cities is under threat: despite enormous advances in technology, there’s simply too much data. The needs of cities and towns beyond massive population centers are growing, and the businesses and customers located there are moving too much data of their own to be served entirely by large, remote data centers. Instead, the increasing draw of edge data centers is creating a scenario in which the number of facilities will increase while the average size decreases. This is the future of data centers: large-scale computing in small-scale facilities.
You might think hyperscale is an example of data centers getting bigger instead of smaller. But hyperscale data centers are designed from the ground up for intensive scaling that doesn’t involve expanding to more floors in the same building. They implement servers that can be swapped in and out at a moment’s notice, cooling systems that can handle a wider range of temperatures, and power infrastructure that can keep pace with the demands of advancing technology. Hyperscale is a reason not to build up: it is designed to pack as much computing capacity as possible within its walls while remaining efficient in both data handling and energy use.
Edge data centers—small, remotely managed facilities that are often part of larger deployments—are the best way to meet the needs of individuals and businesses. The edge trend appeared to be in danger a year ago, when companies seemed intent on selling off all of their server space to cloud giants such as Amazon, Google and Microsoft. But now, not only are companies slowing the sale of their own data environments, more and more of them are pushing into small, local data centers. The demands of smaller companies are growing louder, because their latency problems represent an opportunity that a variety of organizations can tackle and profit from.
Edge computing is also a solution for the coming wave of IoT devices, which are already running into difficulties when they must rely on remote cloud systems. Edge servers, on the other hand, can process information from IoT sensors and devices locally, meaning not only faster data collection but faster processing and analysis.
This trend is also inseparable from the growing mistrust of enormous cloud services. Facebook CEO Mark Zuckerberg’s testimony before the U.S. Congress and the implementation of the EU’s GDPR both illustrate how these services have come under fire. Edge computing offers security as both a selling point and a strategy for businesses and their users: quicker access to your own data, conveniently stored in a location guarded by your own security protocols.
Edge computing is already a powerful part of the computing economy. It addresses one of modern computing’s central demands: more data, available more quickly and in more places. The exponentially increasing amounts of data required for ever more complex everyday computing tasks are pushing even the most expensive hyperscale data centers to their limits. The industry seems to be settling on small data centers as a powerful option, spreading them out so that fast computing isn’t concentrated in large cities.
Edge data centers are becoming more powerful as they become more numerous, and their ability to sustain advancements in AI and virtualization has made them critical pieces of infrastructure for businesses looking to maintain their own data sources and data processing. These facilities are increasing in number as companies slow their sale of in-house data centers in the wake of cloud-service scandals and increased regulation.
All of these trends point to a computing future that’s much wider than it is tall. Computing is moving outward, and capabilities are advancing at a rate that makes the idea of a high-rise data center difficult to imagine. We simply won’t need them, and securing data center properties in a wider variety of places looks like a much better bet than building up existing ones.
Steven Cooke is Managing Director for the USA West Region at Linesight, a global construction consultancy that provides independent cost-management and general consultancy services to the U.S. construction industry. A native of Edinburgh but based in San Francisco, Steven is a chartered surveyor and has more than 20 years’ professional experience in construction cost management. – Data Center Journal