
AI Drives Data Center Management For Global Business

According to the Gartner report The Data Center Is Dead, and Digital Infrastructures Emerge, “80 percent of enterprises will have shut down their traditional data center by 2025.” That’s compared with only 10 percent of enterprises today. The report, published in April 2018, also claims that the traditional data center “will be relegated to that of a legacy holding area, dedicated to very specific services that cannot be supported elsewhere, or supporting those systems that are most economically efficient on-premises.”

Yet, interconnect services, cloud providers, the Internet of Things (IoT), edge services and SaaS offerings continue to proliferate. Even so, the report claims that “the rationale to stay in a traditional data center topology will have limited advantages.”

Reading the report, it would be easy to think that a tide of revolutionary change is in progress, one that is changing the way organizations deliver services to their customers and to their businesses. Yet reports such as these aren’t always accurate. Artificial intelligence (AI) and machine learning (ML), for example, are now driving change in data centers. So, will they not only drive data center management, but also save the data center from its forecast demise?

Fashionable parallels

In my view, it’s very easy to draw parallels between the fashion industry and the computer industry. Each year something new emerges that will fundamentally change everything, or so the vendors hope, and they add the prospect of losing out if you don’t conform to the current trends. Yet many of us, myself included, have been around far too long to fall for this trap.

It’s easy to slip into asking, “haven’t we seen this before in another disguise?” Then we come to the soothsayers. I wish I could find the article that reviewed industry predictions and concluded that, while most predictions came true, they happened much later than expected and created a larger impact than predicted. Equally, customers are constantly told that certain technologies are dead, a bit like the assumption that tape, or the paperless office, is dead!

IT’s teenage years

From my perspective, it feels like the IT industry has never grown out of its ultra-passionate, all-or-nothing, black-or-white teenage years. So where do we sit with this Gartner prediction? Well, it’s possible to dismiss it as yet another plug for everything cloud: however unrealistic it may seem to the forecasters, data centers are here to stay. In my view, they will always exist, because they have been the bedrock of computing throughout all the changes the industry has witnessed, from bureau computing, client-server topologies and the internet to the PC revolution and the migration to the cloud. There are many other technologies holding out, too, including tape.

So, on the face of it, the Gartner report can easily be dismissed. But let’s look at it again in light of several technologies coming down the road that rely heavily on IT functions, and of what users expect from IT. The IT industry has tended to focus on speeds-and-feeds development to satisfy the ever-increasing need to reduce response times: faster CPUs, faster close-coupled memory and I/O, solid-state disks, rapid increases in network bandwidth and, last but not least, ways to store the deluge of data we all generate each year. And if you think you have the data storage equation licked, think again: with autonomous and connected vehicles, together with IoT, AI and ML, there is an absolute tsunami of data heading our way.

The data emphasis

The emphasis on data is going to define IT infrastructures in the future, and while I’d rather not agree with Gartner, I think we’re looking (with a few reservations) at the future in the same way. Traditionally, people have used data as a means to an end: in industry we absorb, we compute, and then we have the result. To put structure around it, the data is then categorized into access and retention cost tiers.

These range from Tier 1, the close-coupled, high-speed, high-cost, low-latency storage, all the way through to long-term, slow-access archive data that might sit in a low-cost solution located somewhere completely different. However, in industry I see the beginning of a transformation in how data is perceived and used. In some circumstances, data is sold as a commodity and delivered in a way that is very akin to a traditional manufacturing process. The video playout and music streaming companies are a very good example.
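To make the idea concrete, here is a minimal sketch of how data might be assigned to such tiers automatically; the dataset fields, thresholds and tier labels are purely illustrative assumptions, not taken from any specific product.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    accesses_per_day: float   # how often the data is read
    max_latency_ms: float     # latency the consuming application can tolerate

def assign_tier(ds: Dataset) -> str:
    """Map a dataset onto an access/retention cost tier.

    The thresholds are illustrative only; a real policy would also weigh
    cost, compliance and retention requirements.
    """
    if ds.max_latency_ms < 1 and ds.accesses_per_day > 1000:
        return "Tier 1: close-coupled, low-latency flash"
    if ds.accesses_per_day > 10:
        return "Tier 2: nearline disk"
    if ds.accesses_per_day > 0.1:
        return "Tier 3: object storage / cloud"
    return "Tier 4: long-term archive (tape or cold cloud)"

print(assign_tier(Dataset("orders-db", 50_000, 0.5)))          # Tier 1
print(assign_tier(Dataset("2019-cctv-footage", 0.01, 5_000)))  # Tier 4 archive
```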

After post-production and digitization, Netflix, Amazon and Spotify, just like a traditional manufacturer of products, warehouse their data and products in the cloud. They are then shipped out to the local edge; in the case of Netflix, the edge is your local internet service provider (ISP), ready to play out upon purchase. So, this equates to a simple “pay and play” architecture optimized for data delivery.

Differing requirements

The Internet of Things, together with autonomous and connected vehicles, offers another example; you can throw in smart cities for good measure. These have very different data and computing requirements. The data emanating from these devices takes the form of status information, and in many circumstances, such as controlling industrial processes, some of that status data is actionable. How and when to deal with it differs, depending on the immediacy of the data.

This opens up the questions: can this be done in the cloud, or do latency and urgency require a small computing function close to the edge, as at a large refinery? Things can easily escalate out of control when decisions must be made across a number of IoT devices and the comms link back to the cloud is jammed. Equally, and just as important, all the historic data needs to flow back to a point where it can be used as fodder for AI and ML processes, but this can be done at a more leisurely pace.
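A hypothetical sketch of that routing decision follows: readings that must be acted on within a tight deadline are handled at the edge, while everything still flows back to the cloud for later AI and ML use. The field names and the 50 ms deadline are assumptions for illustration only.

```python
import queue
import time

# Assumed deadline: readings that must be acted on faster than this
# cannot wait for a round trip to the cloud.
EDGE_DEADLINE_S = 0.05

cloud_backlog: "queue.Queue[dict]" = queue.Queue()  # shipped back later as ML training fodder

def act_locally(reading: dict) -> None:
    # e.g. close a valve on the refinery floor
    print(f"edge action on {reading['sensor']} at {time.time():.0f}")

def handle_reading(reading: dict) -> None:
    """Route an IoT status reading based on its immediacy."""
    if reading.get("actionable") and reading["deadline_s"] <= EDGE_DEADLINE_S:
        act_locally(reading)       # urgent: decide at the edge
    cloud_backlog.put(reading)     # all history still flows back, at leisure

handle_reading({"sensor": "pressure-7", "actionable": True, "deadline_s": 0.01})
handle_reading({"sensor": "temp-3", "actionable": False, "deadline_s": 60})
```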

Consider connected vehicles and smart cities working together to manage traffic flows and hold-ups. They are going to need some serious computing power and storage to amass all the data from potentially tens of thousands of devices such as cars, cameras and traffic flow monitors, and for interaction with the emergency services. This would require a two-way flow of traffic, where information and entertainment are passed to the vehicles. Wouldn’t it be great if we could link the emergency services’ vehicles and traffic management systems to clear a path through our jammed-up cities? I think so.

Defining infrastructure

So how do these requirements define our infrastructure? Traditionally, there would need to be a large data center crunching away, which is all very well during the day, but what is it going to do at night when our cities empty? This requires a much more dynamic approach, where additional resources can be added or subtracted automatically on demand, such as in the case of a large emergency.
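That demand-driven behaviour can be sketched as a simple scaling rule; the utilization bands and node limits below are placeholder assumptions, and a real controller would be considerably more sophisticated.

```python
def desired_nodes(current_nodes: int, cpu_utilisation: float,
                  min_nodes: int = 2, max_nodes: int = 200) -> int:
    """Very simple demand-driven scaling rule.

    Utilisation is the fleet-wide average (0.0 to 1.0). The 40%/75% bands
    are illustrative; a real controller would also damp oscillation.
    """
    if cpu_utilisation > 0.75:            # daytime peak, or a large emergency
        return min(max_nodes, current_nodes * 2)
    if cpu_utilisation < 0.40:            # the city empties overnight
        return max(min_nodes, current_nodes // 2)
    return current_nodes

print(desired_nodes(10, 0.90))   # -> 20
print(desired_nodes(10, 0.20))   # -> 5
```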

The other area I see changing considerably is the follow-me computing function. Just a few years back, mobile phones stored little more than telephone numbers and text messages. Today, everyone has an unbelievable level of storage and computing functionality in the palm of their hand in a smartphone. However, there will come a point where pushing Moore’s Law becomes more and more expensive.

Yet we are building more and more applications where customers will demand greater sophistication on their devices, such as a surveyor pointing a tablet at the ground and displaying the soil structure underfoot, or a paramedic scanning a patient in the ambulance and using AI to diagnose their symptoms or assess their injuries. To meet this extra computational need, a form of supplementary computing function at the edge or near-cloud, permanently attached to the user, will be required. In the future, everyone will have their own personal mini-compute and storage device that follows them wherever they go, automatically migrating to the nearest access point, a bit like a balloon on a string. In essence, this is Balloon Computing.
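As a toy illustration of the “balloon on a string” idea, the sketch below simply re-homes a user’s personal compute function to the nearest of a few hypothetical edge sites; the site list and the flat-plane distance calculation are simplifying assumptions.

```python
import math

# Hypothetical edge sites: (name, latitude, longitude)
EDGE_SITES = [
    ("london-docklands", 51.505, -0.02),
    ("slough", 51.51, -0.59),
    ("manchester", 53.48, -2.24),
]

def nearest_site(user_lat: float, user_lon: float) -> str:
    """Pick the edge site closest to the user, so their 'balloon' of compute
    and storage can be re-homed there as they move. The planar distance on
    latitude/longitude is a crude approximation, good enough to illustrate."""
    def dist(site):
        _, lat, lon = site
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(EDGE_SITES, key=dist)[0]

print(nearest_site(51.45, -0.97))   # a user near Reading -> "slough"
```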

The data equation

Everything we do creates more and more data and, as businesses and as consumers, we will consume more and more of it. Moving this ever-growing data around is very painful, whether it is to and from the cloud, the data center or the edge. The pain comes from the fact that the speed of light is just not fast enough for the volumes of data everyone is moving. No matter how much bandwidth is thrown at the problem, once double-digit milliseconds of latency are reached, there will be little improvement in WAN performance without a WAN data acceleration solution to mitigate the effects of latency and packet loss.
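A rough back-of-the-envelope calculation shows why: a single TCP flow can never move data faster than its window size divided by the round-trip time, so once latency reaches double-digit milliseconds the raw link bandwidth stops being the limiting factor. The 64 KiB window and the 20 ms and 10 ms round-trip times below are illustrative assumptions.

```python
def tcp_throughput_ceiling_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on a single TCP flow: window size divided by round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

# A 64 KiB receive window over a 20 ms WAN link:
print(f"{tcp_throughput_ceiling_mbps(64 * 1024, 20):.0f} Mbit/s")   # ~26 Mbit/s
# Doubling the link's raw bandwidth changes nothing; halving the RTT doubles it:
print(f"{tcp_throughput_ceiling_mbps(64 * 1024, 10):.0f} Mbit/s")   # ~52 Mbit/s
```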

So, what is the future of the data center? Will everything migrate to the cloud because it is cheaper? That is not always the case. The cloud can be very cost-effective when used wisely, but it is not a panacea for all the ills of the data center. There is no doubt that the way people use, manipulate and store data has changed dramatically; it is far more distributed than ever before and will be increasingly so. Yet the role of the data center will evolve, just as other aspects of IT have. Take tape, for example: it evolved from online storage to nearline, to backup, to archive, and data centers will evolve likewise.

Some key functions will stay with the data center, one of them being latency-critical databases. Several companies have experienced poor response times (and end-user complaints) after locating databases in the cloud, which has forced them to migrate back to the data center. However, with the highly flexible, distributed data and computing requirements of the future, the data center will morph into a command-and-control function.

Achieving flexibility

To achieve this level of flexibility, there must be a move away from the current method of hand-cranking everything. It’s time to employ artificial intelligence (AI) and machine learning (ML) to provide a high level of automation and abstraction, creating flexible, dynamic infrastructures.

Moving data is going to become critical to having the data where you want it, when you want it. Traditionally, WAN optimization has been used to improve data throughput over distance, but this technology has upper bandwidth limitations. To maximize the flow of data over high-speed networks, WAN data acceleration solutions that use AI and ML, such as PORTrockIT, are required.

Said Tabet, lead technologist for AI strategy at Dell EMC, agrees that data center managers should be looking to harness AI to find better ways to optimize data center infrastructure. Shariq Mansoor, founder and CTO of Aera Technology, adds: “Without AI, it will become almost impossible to run profitable data center operations.” It can therefore be argued that AI and ML are needed to drive data centers forward. ―DCD
