This year, network teams are poised to implement advanced platforms, tools, and methodologies that bring aging networks into a new era.
Interconnection is expected to become more and more integral to business infrastructures as digital transformation continues to drive the business ecosystem. The explosive growth of internet-connected devices, along with new applications that require real-time computing power, is forcing businesses to rethink the ways they manage and transport data. As businesses continue to turn digital, they are moving their computing from centralized data centers to a hybrid infrastructure at the edge. At its most basic level, this means moving computation and data storage closer to the devices where data is being gathered, rather than relying on a central location that can be thousands of miles away.
This transition requires moving from a centralized IT services model to one that is geographically distributed and regionalized with cloud, resulting in a hybrid multi-cloud infrastructure. The convergence of significant macro, technology, and regulatory trends has made it more complex for businesses to make this transition to the edge. Most businesses today are successfully navigating their way by deploying private traffic-exchange points between counterparts – otherwise known as interconnection services. This is because leveraging multipoint connectivity via direct, private traffic-exchange points between users and local services gives businesses ready access to third-party, cloud-enabled apps and analytics services.
According to Equinix’s Global Interconnection Index (GXI) Volume 3, capacity is expected to reach 13,300+ Tbps by 2022, which represents a 51 percent compound annual growth rate. This leap is consistent globally and across all industries, including energy, healthcare, government, and manufacturing, pointing to the universal need for interconnection services. The GXI identified that when businesses spend greater than USD 50,000 per month on distributed IT services, the need for interconnection bandwidth capacity increases 4x on average to support real-time interactions. For businesses operating in more than three countries, the GXI forecasts a 5x increase in the interconnection bandwidth required to locally connect data sources and security controls to meet data compliance regulations and reduce cybersecurity vulnerability points.
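The compounding behind these projections is simple arithmetic; a minimal sketch, with the base capacity normalized to 1 (the 51 percent CAGR is the GXI's figure; everything else here is illustrative):

```python
def project_capacity(base_tbps: float, cagr: float, years: int) -> float:
    """Project interconnection bandwidth under compound annual growth."""
    return base_tbps * (1 + cagr) ** years

# At a 51 percent CAGR, capacity multiplies by 1.51^n over n years.
growth_over_5_years = project_capacity(1.0, 0.51, 5)  # ~7.85x in five years
```

At that rate, capacity roughly doubles every twenty months, which is why the GXI's multi-year forecasts look so steep.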
As businesses continue to interconnect at the edge, they are setting higher standards for business performance; and as customer and partner expectations rise, the shift to an interconnected, edge-first workload and application architecture will be essential for a digital-ready infrastructure that helps enterprises stay competitive.
Wi-Fi 6 adoption
In the meantime, the Wi-Fi 6 era is nearly here. Wi-Fi 6, which will eventually supplant the 802.11ac (or Wi-Fi 5) standard that most current networks and devices run, promises several innovations that increase speeds, throughput, and spectral efficiency.
With more than 13 billion active Wi-Fi devices deployed globally, accounting for more than half the world’s daily internet traffic, the advent of this new specification is a big deal. And enterprises – which need to manage the soaring number of devices that connect to wireless networks in ways that reduce costs, automate operations, and resolve problems faster – are especially excited. However, they need to be wary of letting that excitement get the best of them and deploying early Wi-Fi 6-enabled products that may end up as paperweights because of compatibility issues.
Wi-Fi 6 access points are hitting the market – which is great – but not all are created equal. Businesses need to be careful about purchasing and deploying them unless the vendor can assure the products will be software-upgradable to support the Wi-Fi 6 certification requirements.
The Wi-Fi 6 features supported in the early products that vendors are releasing can vary, supporting some but not all of the standard's capabilities:
- OFDMA, which increases network efficiency and reduces latency in high-demand environments;
- MU-MIMO, which allows more data to be transferred simultaneously to and from a large number of concurrent clients;
- 1024-QAM, which increases throughput in Wi-Fi devices by encoding more data in the same amount of spectrum; and
- target wake time (TWT), designed to improve battery life in Wi-Fi devices such as Internet of Things (IoT) devices.
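The headline gain from 1024-QAM is easy to quantify: an M-point QAM constellation carries log2(M) bits per symbol, so the step from Wi-Fi 5's 256-QAM to 1024-QAM packs 25 percent more bits into the same spectrum. A minimal sketch:

```python
import math

def bits_per_symbol(qam_order: int) -> int:
    """Bits encoded per modulation symbol for an M-point QAM constellation."""
    return int(math.log2(qam_order))

wifi5 = bits_per_symbol(256)    # 802.11ac tops out at 256-QAM: 8 bits/symbol
wifi6 = bits_per_symbol(1024)   # Wi-Fi 6 adds 1024-QAM: 10 bits/symbol
gain = (wifi6 - wifi5) / wifi5  # 25 percent more data per symbol
```

Real-world throughput gains are smaller, since higher-order modulation requires a cleaner signal to decode reliably.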
In combination with reliable, high-speed WAN links, Wi-Fi 6 will improve application performance and the ability to deploy IoT systems. It is only a matter of time before every enterprise moves to the new standard, but getting that timing right spells the difference between a wise investment and wasting money on sub-optimal infrastructure.
Edge computing is about to bloom
This calendar year will be the one that propels edge computing into the enterprise-technology limelight for good, according to a set of predictions from Forrester Research. While edge computing is primarily an IoT-related phenomenon that addresses the need for on-demand compute, real-time app engagements will also play a role in driving the growth of edge computing in 2020.
What it all boils down to, in some ways, is that form factors will shift sharply away from traditional rack, blade, or tower servers in the coming year, depending on where the edge technology is deployed. It will also mean that telecom companies will begin to feature a lot more heavily in the cloud and distributed-computing markets. CDNs and colocation vendors could become juicy acquisition targets for big telecom, which missed the boat on cloud computing to a certain extent, and is eager to be a bigger part of the edge. They are also investing in open-source projects like Akraino, an edge software stack designed to support carrier availability.
But the biggest carrier impact on edge computing in 2020 will undoubtedly be the growing availability of 5G network coverage, Forrester says. While that availability will still mostly be confined to major cities, that should be enough to prompt reconsideration of edge strategies by businesses that want to take advantage of capabilities like smart, real-time video processing, 3D mapping for worker productivity, and use cases involving autonomous robots or drones.
Beyond the carriers, there is a huge range of players in edge computing, all of whom have their eyes firmly on the future. Operational-device makers in every field from medicine to utilities to heavy industry will need custom edge devices for connectivity and control; huge cloud vendors will look to consolidate their hold over that end of the market; and AI/ML startups will look to enable brand-new levels of insight and functionality.
What is more, the average edge-computing implementation will often use many of them at the same time, according to Forrester, which noted that integrators who can pull products and services from many different vendors into a single system will be highly sought-after in the coming year. Multivendor solutions are likely to be much more popular than single-vendor, in large part because few individual companies have products that address all parts of the edge and IoT stacks.
Edge security. With the rise of edge computing has come a need for edge security. Edge security is not just about securing edge computing, though; it is also potentially a new approach to defining user and enterprise security in the cloud-connected world. There are several aspects involved in edge security, including perimeter security, application security, threat detection, vulnerability management, and patching cycles.
In 2019, a new term was coined by Gartner to define a category of hardware and services that help enable edge security – Secure Access Service Edge (SASE). SASE is an emerging offering, combining comprehensive WAN capabilities with comprehensive network-security functions, such as secure web gateways (SWG), cloud access security brokers (CASB), firewalls as a service (FWaaS), and zero-trust network access (ZTNA), to support the dynamic secure access needs of digital enterprises. Gartner predicts that by 2024, at least 40 percent of enterprises will have explicit strategies to adopt SASE, up from less than 1 percent at year-end 2018.
Openness and disaggregation
Networking has long relied on proprietary architecture, particularly on the transport layer. The thinking behind this is that while single-vendor solutions tend to be more expensive, they are also more reliable and easier to manage than a collection of interoperable boxes. But this attitude is quickly coming to an end as new technologies and new data-networking requirements fuel demand for open and disaggregated solutions.
Even traditional server and storage environments, which have long been plagued by discrete, largely non-interoperable platforms, are becoming integrated under virtualized, software-driven architectures, says Arthur Cole, an analyst and senior author.
A key initiative in this drive is the Open and Disaggregated Transport Network (ODTN) project by the Open Networking Foundation. The basic idea is to create a common transponder format so that operators are no longer required to use matching transponders from a single vendor, and can even pair these on an open-line system by a third vendor. Using an open network operating system (ONOS) SDN controller, ODTN will be able to automatically discover all disaggregated components on the network, providing a truly vendor-neutral management framework through industry-standard solutions like Transport API (T-API) and OpenConfig.
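As an illustration of what vendor-neutral management looks like in practice, the sketch below builds a transponder configuration fragment loosely modeled on the OpenConfig terminal-device model; the structure, field names, and values are illustrative assumptions, not a conformant instance of the YANG module:

```python
import json

# Illustrative logical-channel configuration for a disaggregated transponder,
# loosely modeled on openconfig-terminal-device. Paths and values are
# assumptions for the sketch, not a validated OpenConfig payload.
transponder_config = {
    "terminal-device": {
        "logical-channels": {
            "channel": [
                {
                    "index": 1,
                    "config": {
                        "index": 1,
                        "admin-state": "ENABLED",
                        "rate-class": "TRIB_RATE_100G",
                    },
                }
            ]
        }
    }
}

# Serialize for delivery to an SDN controller over a RESTCONF-style interface.
payload = json.dumps(transponder_config, indent=2)
```

The point of models like OpenConfig and T-API is precisely that a controller such as ONOS can emit one declarative payload like this regardless of which vendor built the transponder underneath.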
The complexity of the data environment is increasing exponentially these days. To keep pace, the enterprise needs unfettered access to the latest networking solutions, and this can only be accomplished by shedding the proprietary, single-vendor mindset. With open, disaggregated architectures, providers can finally define their networks according to what the market requires, not what their vendor offers.
Multi-cloud is the new normal for enterprises – a reality that presents a number of challenges for IT teams trying to get a handle on their disparate cloud resources. By 2021, more than 90 percent of enterprises will rely on multi-cloud to meet their evolving IT needs, according to IDC. This shift is driven by the evolution of the on-premises data center and the increasing sophistication of cloud and edge computing. Multi-cloud adoption is also being pushed by developers and data analysts in line-of-business teams that move to the cloud for service functionality, analytics capabilities, geographic distribution, and more, according to Mary Johnston Turner, research vice president at IDC.
The term multi-cloud has been around for years but still lacks a single definition. IDC takes an expansive view on the term, counting any mix of IaaS, PaaS, SaaS, private cloud, or on-premises dedicated hardware with cloud attributes as a multi-cloud deployment. But those pieces are rarely interconnected, which creates a problem for administrators. With so many moving parts, it can be an intimidating task to adopt a multi-cloud architecture – if the organization is not well-prepared for the move.
Enterprises’ next step in multi-cloud adoption. Multi-cloud architectures add complexity. Each platform has its own set of rules for operations and management, which makes it harder to cross-train staff and give IT teams a holistic view of a company’s cloud and on-premises assets.
“The IT environment for organizations in India is becoming more complex with multiple public clouds, private clouds, and traditional systems needing to be interconnected, integrated, and collectively managed. Most Indian organizations are clustered around early stages of cloud maturity and find it challenging to move ahead in the adoption curve. As Indian organizations move toward a cloud-first strategy, enterprises are preferring cloud solutions for their new capability, capacity, and functionality,” says Rishu Sharma, associate research manager, Cloud and AI, IDC India.
IDC has made ten predictions impacting technology buyers and suppliers of cloud in India over the next 48 months:
- Mega-platforms. By 2023, the top 4 clouds (mega-platforms) in India will be the destination of choice for 50 percent of workloads, while lock-in will be avoided through multi-cloud and cloud-native approaches to achieve portability.
- Cloud native. By 2022, 40 percent of new enterprise applications in India will be developed cloud-native, based on a hyper-agile architecture, but only 10 percent of those environments will have machine learning capabilities built in.
- Multi-cloud management. By 2023, 55 percent of India’s 500 leading organizations will have a multi-cloud management strategy that includes integrated tools across public and private clouds.
- Consumption-based deployment model. By 2021, more than 30 percent of enterprise IT operations spend in India will be consumption-based, opting for a public cloud platform as a lower-risk option to manage complexity and aligning cost to consumption.
- Redefining the edge. By 2023, more than 30 percent of organizations’ cloud deployments in India will include edge computing to address bandwidth bottlenecks, reduce latency, and process data for decision support in real time.
- SaaS and cloud verticalization. By 2022, organizations in India will spend more on vertical SaaS applications, excluding desktop and internal employee-productivity apps, than horizontally designed applications.
- Private cloud expansion. By 2020, 60 percent of enterprises in India using public cloud will also use an enterprise private-cloud platform; the majority of these platforms will support delivery of higher-layer PaaS and SaaS functionalities.
- Data eruption. By 2023, 40 percent of India’s 500 leading enterprises will be AI-enabled, with over 40 percent of enterprise application workflows aided by AI to better utilize legacy data, real-time operational data, and third-party data feeds.
- Managed cloud services. Enterprises’ needs in India to optimize RoI, reduce budgets, and cope with the scarcity of cloud experts will drive spending on managed cloud services to nearly USD 1.2 billion by 2022, accounting for almost 25 percent of technology outsourcing.
- Life-cycle automation. By 2022, 30 percent of organizations in India will have invested in automation, orchestration, and development life-cycle management of cloud-native applications to realize the cost benefits and operational efficiencies.
Data encryption matters in the cloud. While the cloud in many ways offers better security than the traditional on-premises data center, this should not obscure the fact that complex architectures tend to have more vulnerabilities than simple ones, and the cloud is nothing if not complex.
One of the key challenges is securing data as it navigates between the cloud and the user. Whether this journey runs across town or halfway around the world, data is likely to transition between multiple network providers and other handlers, even if the entire wide area infrastructure is harnessed under a single WAN or SD-WAN solution. Not only does this introduce gaps in security that can be exploited, it also often leads to the deployment of multiple encryption and other security mechanisms, all of which act to hamper performance and diminish the kind of visibility the enterprise needs to manage traffic flows.
This is why many organizations are turning to network-level encryption for their entire cloud ecosystem. By roping all data communications under a single solution, organizations are finding that they can quickly fulfil the requirements of emerging regulatory regimes like GDPR and PCI DSS across their distributed data footprints, while at the same time cut down on the management headaches of having to oversee countless service- or provider-based solutions.
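For context on what the fragmented, per-service approach involves, the sketch below shows a hardened client-side transport-encryption setup using Python's standard ssl module; hardware network-level encryptors consolidate exactly this kind of per-application configuration into a single layer:

```python
import ssl

# A minimal sketch of per-application transport encryption. Every service
# that handles data in transit must repeat configuration like this, which
# is the management overhead that network-level encryption consolidates.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
context.check_hostname = True                     # validate server identity
context.verify_mode = ssl.CERT_REQUIRED           # reject unverified peers
```

Multiplied across dozens of services and providers, each with its own settings to audit, the appeal of a single network-wide encryption layer becomes clear.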
According to MarketsandMarkets, the network encryption market is set to expand from USD 2.9 billion today to USD 4.6 billion by 2023, a compound annual growth rate of 9.8 percent. This is, in fact, one of the few areas of the IT stack that is expected to be dominated by hardware rather than software in the coming years. With a solid hardware foundation, network security benefits from top performance in high-speed, low-latency environments, and a single platform can provide robust security across all endpoints, networks, and applications.
Cloud security. Along with all the benefits that the cloud offers, it comes with security concerns. Because cloud is a different way of delivering IT resources, cloud security encompasses the same security concerns as on-premises IT, plus others unique to the cloud. Among the areas addressed by cloud security products are access control, workload security, privacy and compliance, and more.
When considering cloud security products, it is important to recognize and understand the different categories of solutions that are available to help organizations reduce risk and improve security. Among them are cloud access security brokers, software-defined compute (SDC) security, cloud workload-protection platforms, and SaaS security. And each of the major public cloud providers (Amazon Web Services, Google Cloud Platform, and Microsoft Azure) also have their own native cloud security controls and services that organizations can enable.
2020 is shaping up to be a busy year for enterprise IT network teams. Other than SD-WAN implementations, which are well on their way to becoming the default WAN connectivity option for enterprises, and Wi-Fi 6 upgrades, the past few years have been a bit slow in terms of implementing new and innovative technologies. This year, that is set to change, with many network teams implementing advanced platforms, tools, and methodologies that bring aging networks into a new era.
From a high-level perspective, network teams will be tasked with creating networks that support emerging services, create time-saving efficiencies, and extend the reach of the corporate network footprint. Digitization of previously manual processes will drive the need for faster network speeds and new services. Many of these technologies will be implemented in the LAN/WAN, while others are more cloud- and service-provider-oriented. Regardless, these technologies can enhance visibility, speed up deployment, and ultimately drive business productivity well into the next decade. Networking technologies and architecture trends that the new year is ushering in include network automation, 5G for branch-office connectivity, IoT network segmentation and monitoring, simplification of the internet edge, network analytics, consistent policy management across hybrid and multi-cloud networks, and emerging changes in edge computing.
Private 5G networks – Enterprise untethered
Deloitte Insights, in its report, TMT predictions 2020, asserts that 5G’s new standards for enterprise will open the floodgates to a host of previously infeasible applications, allowing for industrial-scale Internet of Things networks in factories, warehouses, ports, and more.
To enable enterprise connectivity – and not just any connectivity, but ultra-reliable, high-speed, low-latency, power-efficient, high-density wireless connectivity – a company likely has two basic options. It can connect to a public 5G network. Or it can opt for a private 5G network, either by purchasing its own infrastructure while contracting for operational support from a mobile operator, or by building and maintaining its own 5G network, using its own spectrum. For many of the world’s largest businesses, private 5G will likely become the preferred choice, especially for industrial environments such as manufacturing plants, logistics centers, and ports.
Deloitte expects that more than 100 companies worldwide will have begun testing private 5G deployments by the end of 2020, collectively investing a few hundred million dollars in labor and equipment. In subsequent years, spend on private 5G installations, which may be single-site or spread across multiple locations, will climb sharply. By 2024, the value of cellular mobile equipment and services for use in private networks will likely add up to tens of billions of dollars annually. And, although not all enterprise 5G networks will be private, many organizations will have good reasons to want them to be. Unlike a public network, a private 5G network can be configured to a location’s specific needs, and configurations can vary by site, depending on the type of work undertaken in each venue. A private network also allows companies to determine the network’s deployment timetable and coverage quality. The network may be installed and maintained by onsite personnel, enabling faster responses to issues. Security can be higher, affording network owners a degree of control that may not be possible on a public network.
As capable as wires, but without the wires. Private 5G for enterprises will exploit new capabilities, available in the next phase of the 5G standard, known as 3GPP Release 16. Release 16 aims to enable 5G to substitute for private wired Ethernet, Wi-Fi, and LTE networks, and includes multiple capabilities designed specifically for industrial environments. The various 5G networks that launched commercially in 2019 were based on Release 15; a Release 17 that will focus on additional applications, such as 5G broadcast, is also planned for the mid-2020s.
5G is not the only option for getting online, of course. In the short term (through about 2023), 5G will likely coexist with many other cellular mobile and Wi-Fi standards, as well as wired standards that are widespread today. In fact, in the medium term (through 2026 or so), most companies will likely deploy 5G in combination with existing connectivity, including wired Ethernet networks. However, in the long term – over the next 10 to 15 years – 5G may become the standard of choice in demanding environments, when flexibility is paramount, reliability is mandatory, or for installations that require massive sensor density.
The hotbeds of private 5G implementation. Thanks to the specifications in Release 16, 5G has the potential to become the world’s predominant LAN and WAN technology over the next 10 to 20 years, especially in greenfield builds. Those building a new factory, port, or campus may significantly reduce their usage of wired connections. The next 5 years will likely see a boom in private 5G implementations at locations that would greatly benefit from better wireless technology – in terms of speed, capacity, latency, and more – right now.
Deloitte predicts that about a third of the 2020–2025 private 5G market, measured in dollars of spend, will come from ports, airports, and similar logistics hubs, the first movers. It is not hard to see why. A major seaport (for instance) has some fixed machinery and equipment that can connect to networks over cables, but it also needs to track and communicate with hundreds of forklifts and dollies – not to mention hundreds or thousands of employees – in a controlled, sensitive, and secure environment. Further, port managers need to track multiple data points for thousands or tens of thousands of containers – exactly where each container is, whether it has cleared customs, whether it is at the right temperature, whether anyone has moved or opened it, whether anything has been removed or added, and so on. Ideally, every single high-value object in every single container could be tracked – potentially a million objects. And all this must be done in an area of only about one square kilometer, filled with moving metal objects and radiofrequency-emitting devices.
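The per-container tracking workload described above maps naturally onto a simple record type; a minimal sketch, in which the field names, container ID, and coordinates are all hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Container:
    """Hypothetical per-container record of the kind a port's private 5G
    sensor network would keep updated in near-real time."""
    container_id: str
    location: Tuple[float, float]          # (latitude, longitude) of last fix
    customs_cleared: bool = False
    temperature_c: Optional[float] = None  # None until a sensor reports
    tamper_events: List[str] = field(default_factory=list)

# Hypothetical container, for illustration only.
box = Container("MSKU1234567", (1.264, 103.840))
box.customs_cleared = True
box.tamper_events.append("door opened 2020-03-01T04:12Z")
```

Multiply a record like this by tens of thousands of containers, each reporting continuously over a dense wireless fabric, and the appeal of 5G's high device density and low latency in a port becomes concrete.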
Another third of the total private 5G opportunity will come from factories and warehouses. Today, these facilities operate with a mix of wired and wireless technologies, but many companies are adopting new equipment that they expect to transform their business – but that would not work with wires.
The final third of the private 5G market will consist of greenfield installations, especially on campuses. In fact, many companies may initially choose to deploy 5G only for greenfield sites, creating islands of private 5G adoption among a heterogeneous mix of connectivity technologies at legacy sites.
Companies can take multiple approaches to deploying a private 5G network. The very large companies are likely to install private 5G networks using fully owned infrastructure and dedicated spectrum (in markets where this is permitted), managing these networks either through an in-house team or via an outsourced mobile operator. Medium-sized and smaller companies are more likely to lease network equipment, outsource network management, and sublease spectrum (geofenced to their location) from a public mobile operator – or, in some cases, use unlicensed spectrum. A mobile operator, systems integrator, or equipment vendor may manage the network and all of its attached elements.
Hundreds, and eventually thousands, of companies are likely to deploy private cellular networks over the next decade. Some may simply swap some or all of their cables for wireless, but potentially much more rewarding – though more challenging – would be to pair private 5G deployment with process change and business model redesign. As more and more companies undertake transformations on the back of 5G, the shape of industry itself will alter, perhaps dramatically. If and when that happens, history will likely view 5G not just as a technological marvel, but as an elemental force that reshaped the way companies do business.
The world is a global village
The yet-unnamed coronavirus that originated in Wuhan, China, has laid bare an unpleasant fact about globalization – a viral infection that emerges anywhere can spread everywhere. To stop such an outbreak, the best remedy is radical, sudden, and temporary deglobalization. This containment deglobalization is really clobbering the tech industry. It is going to get a lot worse before it gets better.
The most immediate impact is on technology trade shows. The largest mobile trade show, Mobile World Congress, had to cancel its event. Dozens of Chinese trade shows have been cancelled outright. Our very own Convergence India had to be postponed too.
The next effect will be sales and manufacturing. Foxconn, where iPhones are made, is closed indefinitely – at least for making electronics. For now, they have switched to making face masks. Sellers on Amazon are bracing for product shortages resulting from factories closed because of the coronavirus. Companies like Google, Amazon, Facebook, and Microsoft have shut down offices in China or curtailed travel to China.
And, the third effect will be earnings. Most Chinese tech companies are expected to report earnings far below expectations.
Globalization is out and deglobalization is in, maybe. Deglobalization seems to be largely a tech issue: technology companies appear to be either driving the deglobalization trend, or they are the instrument by which national governments are effecting change. But is deglobalization really a thing? Maybe the world is not really deglobalizing. But something is happening. The world is getting more complicated. As large organizations make plans, it is important to keep this growing complexity in mind. In a nutshell, we are facing a future of increasing globalization but also increasing local requirements and contexts, says Mike Elgan in his weekly column in IDG’s Insider Pro.
That means planners and buyers need to step up their game about local knowledge in dozens of countries around the world. It is important to become obsessed with flexibility, adaptation, local customization, and diversification.
And the coronavirus? Well, there is good news and bad news. The good news is that science is getting faster at developing vaccines. The bad news is that this kind of outbreak will happen again and again.
Viruses happen. Politics happens. But business also needs to happen. Learn from these trends that create the illusion of deglobalization and adapt. Your organization and your career depend on it.