Monetising AI

2018 saw businesses adopt AI, albeit in fits and starts. 2019 is expected to see accelerated adoption and monetization of the technology the world over.

Intelligent networking is already making its way into the enterprise, forever changing the ways in which both traffic and resources are managed. 2018 was a definitive year in which the use of AI was dominated by big tech companies like IBM, Google, Microsoft, and Amazon, with a few smaller tech-based start-ups also leading the way in early adoption of these and other technologies. A key breakthrough was cloud-based AI, which made the technology cheaper and easier to use as 2018 drew to a close.

As we head into 2019, with AI increasingly moving from the lab to offices, factories, hospitals, construction sites, and consumers’ lives, leading companies are starting to move their AI models into production, where they will run operations to enhance decision-making and provide forward-looking intelligence to people in every function. The approach is being formalized and company-wide capabilities developed so successful (and smaller) projects can be replicated and built into a greater whole.

There is good news in the Indian context too. India is uniquely positioned to succeed in the adoption of artificial intelligence, which has the potential to amplify the country’s demographic advantages by dramatically expanding human capabilities. For individual businesses, cognitive computing and AI promise significant efficiencies and growth opportunities. So 2019 looks to be a promising year for India, which is at the forefront of the AI surge, with more local innovations from the country and broader adoption across various sectors globally, says Sriram Raghavan, Vice President, IBM Research and CTO, IBM India/South Asia.

The total market for AI-driven networking solutions is expected to hit Rs 39,500 crore by 2023. In fact, by that time, more than half of total AI spend will go toward the network. Much of this will be linked to the deployment of software-defined networking (SDN), as well as edge computing, the IoT, and emerging 5G topologies on the mobile side. Ultimately, this rudimentary intelligence will lead to self-organizing networks (SON) and cognitive network management solutions capable of supporting autonomous decision-making across wide swaths of network infrastructure.

AI and IoT

A key question at this point is how to bring artificial intelligence from the cloud to the edge. Only in the past few years have compute and storage infrastructure evolved to the point where AI can be supported outside of massive systems at universities and scientific research organizations. Pushing this level of number-crunching to the IoT edge will require something along the lines of a blockchain-like distributed computing platform that harnesses the power of legions of idle devices. As well, we will likely see a strong push for AI coprocessors throughout the IoT data chain, and possibly a new generation of AI algorithms that dramatically lessen the data load.

Microsoft, for instance, recently announced Project Brainwave, which seeks to deploy the Intel Stratix 10 FPGA on data center networks in support of real-time AI processing. The idea is to enable deep neural network (DNN) microservices at the hardware level, which removes much of the processing overhead from CPUs in the server. This soft DNN approach is expected to exceed the performance of hard-coded DNN processing units (DPUs) because the FPGA can process AI requests as fast as the network can stream them.

An even better approach, however, is to implement AI on the IoT device itself. Qualcomm recently acquired a Dutch company called Scyfer BV, which specializes in AI platforms for a range of industry verticals. Qualcomm is hoping to leverage Scyfer’s expertise to implement AI on smartphones, smart cars, robotics, and other endpoints in order to provide AI functionality without a network or even a Wi-Fi connection. By pushing AI to the device, Qualcomm says it can provide immediate responses to queries or changing conditions, as well as better privacy protection and more efficient use of network bandwidth.
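To make the on-device idea concrete, here is a minimal sketch of local inference using TensorFlow Lite, a common runtime for exactly this class of endpoint deployment; the model file name and input shape are illustrative assumptions, not anything a specific vendor ships:

```python
# Minimal on-device inference sketch using TensorFlow Lite.
# Assumes a model already converted to .tflite (file name is illustrative).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input shaped to whatever the model expects, e.g. one camera frame.
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()                      # runs entirely on the device
prediction = interpreter.get_tensor(output_details[0]["index"])
print("local prediction:", prediction)    # no network round-trip involved
```

The design point is the last three lines: the query never leaves the device, which is where the latency, privacy, and bandwidth benefits described above come from.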

Networking is the heart of virtually all data architectures since it is the exchange of information that drives real value in most digital processes. But the intelligent IoT is unlike any networking challenge that has come before, both in scale and complexity. The technologies under development now are a good start, but it’s going to take a lot more effort and ingenuity to realize the all-encompassing digital ecosystem that today’s IoT platform providers are promising for the future.

AI Adoption

According to PricewaterhouseCoopers, only 20 percent of companies plan to deploy AI enterprise-wide in 2019. In a BCG/MIT survey, of the 60 percent of respondents who viewed AI strategy as urgent for their organizations, only half already had one in place. In a 2017 Deloitte survey, only 17 percent of executives were familiar with both the concept of AI and its applications at their companies. Yet 73 percent of C-level executives agree or strongly agree that AI has already transformed the way they do business, according to an Infosys report.

Business leaders must strive to introduce AI in ways that are effective and sustainable. It is imperative for forward-thinking leaders to invest time, energy, and resources into learning about the implications of AI for their industries and their organizations.

AI will have broad, industry-agnostic consequences for business operations, one of which will likely be increased internal collaboration. Immense amounts of data are required for AI to function, and in many companies that data is currently fragmented. As AI becomes commonplace, data structures will need to change to fully realize the benefits of AI approaches in daily business practices.

A 2018 PwC report stated that some forward-thinking organizations are already breaking down the silos that separate data into cartels and employees into isolated units, and warned that failure may await those who don’t. The BCG/MIT report, too, affirmed that a hybrid model emphasizing cross-functional collaboration may make the most sense for businesses integrating AI.

One department where AI is already having a measurable impact is human resources. HR is a logical area for companies to introduce AI, as it can serve a myriad of functions: reducing human bias during initial screenings, sifting through resumes and social media, and helping with routine processes like training, onboarding, and evaluation. Of those HR/AI functions, recruiting might be the most utilized.
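As a rough illustration of the screening use case, the toy sketch below ranks incoming resumes with a text classifier trained on past screening decisions; the data, labels, and candidate texts are entirely hypothetical, and in practice such a model needs auditing for exactly the bias it is meant to reduce:

```python
# Toy resume-screening sketch: rank candidates by a model trained on
# historical screening decisions. All data here is purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

past_resumes = [
    "5 years Java, built payment microservices, led a team of four",
    "retail cashier, strong customer service, some Excel",
    "network engineer, BGP and SDN experience, Python automation",
]
advanced_to_interview = [1, 0, 1]  # hypothetical historical outcomes

screener = make_pipeline(TfidfVectorizer(), LogisticRegression())
screener.fit(past_resumes, advanced_to_interview)

new_resume = "Python developer, data pipelines, Hadoop and Spark"
score = screener.predict_proba([new_resume])[0][1]
print(f"screening score: {score:.2f}")  # a human still reviews the shortlist
```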

The most critical, and most difficult, task of any business leader is making decisions. Although AI cannot yet replace a leader’s emotional intelligence and depth of organizational knowledge, it can certainly aid decisions with the use of data. While, unsurprisingly, much of the discussion surrounding AI-enhanced businesses focuses on data and technology, these tech-forward models will only be possible when spearheaded by leaders with emotional intelligence, adaptability, and above all else, vision.

Vendor update

Today’s flurry of AI advances wouldn’t have been possible without the confluence of three factors that combined to create the right equation for AI growth: the rise of big data combined with the emergence of powerful graphics processing units (GPUs) for complex computations and the re-emergence of a decades-old AI computation model—deep learning.

The AI market is a crowded and chaotic space, with big brand-name companies like IBM, Apple, Google, Samsung, Microsoft, Amazon, and many others vying for market share. There are also countless smaller and lesser-known companies in the space trying to be heard and seen.

Even the best companies with the best ideas will not make it if they do not have money. That means investors and funding, which can come from the investment community or from being acquired.

An IBM team spoke with 30 artificial intelligence visionaries to explore some of the catalysts for the next wave of AI advances. They identified a new equation for future AI innovation, one in which old variables get a makeover and new variables spur great leaps. Prepare for a formula that includes the advent of small data, more efficient deep learning models, deep reasoning, new AI hardware, and progress toward unsupervised learning. AI does not replace human thinking but augments it.

AI and the Network

AI will clearly alter the networking landscape through its ability to enhance traffic management, analytics, and the provisioning of virtual resources. But AI will also have a pronounced impact on network infrastructure, and in particular on the fundamental way in which networking silicon processes packets and data streams. And these changes will happen not because of AI’s new capabilities but because AI workloads have vastly different networking requirements than traditional data.

According to Tractica, AI will likely produce a boom in hardware sales that will drive the market from today’s $3.5 billion to $115 billion by 2025. Initially, much of this activity will come in the form of acceleration hardware on the compute side of the data environment, but both storage and networking are likely to see major innovations going forward in order to optimize AI workloads end-to-end. It is also likely that hardware constructs will become more specialized for individual AI workloads, particularly as applications become more adept at provisioning their own resources.

Traditional data is structured and static, which is fine when offloading to discrete storage and processing pools for intense — and often time-consuming — analytics. AI data is more free-flowing and suited to parallel processing technologies like Hadoop. It also requires real-time transit speeds and on-the-fly integration of multiple streams from widely distributed endpoints. As such, AI will require less point-to-point connectivity and more in the way of flexible, composable fabrics capable of auto-discovery and support for highly abstract architectures.
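A small sketch can make the contrast tangible: the Python snippet below fans in readings from several simulated endpoints into a single processing loop, the kind of on-the-fly stream integration described above. All names, counts, and timings are illustrative assumptions.

```python
# Illustrative sketch: fan-in of multiple real-time streams from
# distributed endpoints into a single processing loop.
import asyncio
import random

async def endpoint(name: str, queue: asyncio.Queue) -> None:
    """Simulates a remote device emitting readings at its own pace."""
    for i in range(5):
        await asyncio.sleep(random.uniform(0.01, 0.1))
        await queue.put((name, i))

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    producers = [asyncio.create_task(endpoint(f"sensor-{n}", queue))
                 for n in range(3)]
    consumed = 0
    while consumed < 15:                  # 3 endpoints x 5 readings each
        name, reading = await queue.get()
        consumed += 1
        print(f"integrated {name}: {reading}")
    await asyncio.gather(*producers)

asyncio.run(main())
```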

AI workloads are also likely to produce a rise in field programmable gate arrays (FPGAs) on the network and elsewhere. Vendors are introducing new adaptive compute acceleration platform (ACAP) devices that combine networking, memory, and software development tools to address key workloads like AI, IoT, and 5G. The idea is to support a wide range of architectures under the same basic framework, utilizing the FPGA’s programming capabilities to optimize the environment after the hardware has been deployed. The platform also includes an embedded digital signal processor capable of supporting high-precision floating-point workloads while still maintaining an efficient power envelope.

All of this adds up to a network that must be dramatically faster, more versatile and more adaptable than what we have today. More so than previous forms of computing, AI requires broad optimization across disparate resources. If the network cannot provide the connectivity sufficient to this challenge, the entire transition to a more intelligent, autonomous data ecosystem will grind to a halt.

“At the moment, much of the attention is on how AI will help the network. Going forward, we’ll have to start thinking harder about how the network can help AI,” says Arthur Cole, a consultant on enterprise networks.

AI and Data Centres

One of the initial applications for AI on the network is visibility. As traffic becomes more complex and data infrastructure becomes more distributed over wide-area infrastructure, the need for deep packet-level visibility and real-time telemetry increases.

But even as intelligence is changing the network, it is also altering the way in which data resources are provisioned and consumed, says Market Realist’s Paige Tanner. This is most pronounced in an increasingly intelligent Internet of Things, which is upping the reliance on the cloud and causing many providers to increase the speed and agility of their internal infrastructure.

A case in point is Amazon, which is rapidly deploying NAND flash and other solutions to accommodate the needs of increasingly smart e-commerce applications. As well, cloud providers of all sizes are rolling out GPU-as-a-Service, on-server memory solutions, and the latest high-speed processors from Intel, AMD, and Qualcomm to handle the expected loads from autonomous cars, self-service kiosks, and intelligent agents in the home and workplace.
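As a toy example of what real-time telemetry analysis can look like at its simplest, the sketch below flags latency samples that deviate sharply from a rolling baseline; the series, window size, and threshold are synthetic assumptions rather than any vendor’s method:

```python
# Toy telemetry sketch: flag latency samples that deviate sharply from a
# rolling baseline. The series and threshold are synthetic/illustrative.
from collections import deque
from statistics import mean, stdev

window: deque = deque(maxlen=20)
latencies_ms = [10, 11, 9, 10, 12, 10, 11, 48, 10, 9]  # synthetic samples

for t, sample in enumerate(latencies_ms):
    if len(window) >= 5:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(sample - mu) > 3 * sigma:
            print(f"t={t}: anomaly, {sample} ms vs baseline {mu:.1f} ms")
    window.append(sample)   # the outlier joins the baseline and decays out
```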

Intelligent networking will give the enterprise a boost when it comes to supporting next-generation applications and services, but it still requires a guiding hand when applying data-driven analytics to working production environments. As with any automated system, an intelligent agent can magnify the damage of a process that is fundamentally flawed, particularly when it does not have access to all of the data needed to make an informed decision.

But as application performance becomes increasingly dependent upon the fine-tuning of a vast and complex network infrastructure, expect smart monitoring and management to quickly transition from a competitive advantage to a core necessity.

AI and Telecom

No longer limited to providing basic phone and internet service, the telecom industry is at the epicenter of technological growth, led by its mobile and broadband services in the IoT era. This growth is expected to continue, with Technavio predicting that the global telecom IoT market will post an impressive CAGR of more than 42 percent through 2020. The driver for this growth is, obviously, artificial intelligence.

Today’s communications service providers face increasing customer demands for higher quality services and better customer experiences. Telecoms are addressing these opportunities by leveraging the vast amounts of data collected over the years from their massive customer base. This data is culled from devices, networks, mobile applications, geolocations, detailed customer profiles, services usage, and billing data.

Telecoms are harnessing the power of AI to process and analyze these huge volumes of Big Data in order to extract actionable insights to provide better customer experiences, improve operations, and increase revenue through new products and services.

With Gartner forecasting that 20.4 billion connected devices will be in use worldwide by 2020, more and more CSPs are jumping on the bandwagon, recognizing the value of artificial intelligence applications in the telecommunications industry.

Network optimization, predictive maintenance, virtual assistants and RPA are examples of use cases where AI has impacted the telecom industry, delivering an enhanced CX and added value for the enterprise overall. Technology is already a big part of the telecommunications industry, and as Big Data tools and applications become more available and sophisticated, AI can be expected to continue to grow in this space.

AI makes inroads in broadcasting

AI can bring huge potential benefits to broadcasters, with particular relevance in areas of the workflow that are labor- and time-intensive, like ingest. By enabling broadcasters to track how operations are used across their organization, AI-based solutions can make operations more efficient and bring costs down by identifying trends. As broadcasters of all sizes come under pressure to produce more with lower budgets, AI-based solutions can help them focus their resources on creating more compelling content.

AI is also being explored as a way to test supply chain management enhancements before integrating them into a facility’s workflow. A machine learning system could learn enough about content types by watching content, then experiment with various combinations in an offline environment until broadcasters have sufficient confidence that it manages the supply chain in real time better than manual methods, optimizing for cost and quality at the level of each individual piece of content.

Another potential role is in monitoring. An AI-assisted multiviewer could not only provide more in-depth information about each signal but also put that information in context across all of the individual devices that make up a particular channel of content. Today we monitor by exception; tomorrow that monitoring will be more predictive and seamless.

As promising as AI technology is, many operations touched by AI still need to have a human overseer to ensure smooth operations.

For broadcasters, it’s all about what you do with your bandwidth—the more effectively a broadcaster uses its broadcast bandwidth, the more profitable it can be. For this reason, AI products can now address bandwidth, including learning from one encoding session to improve the next.

AI plays an increasingly important role in video encoding too, where it can significantly improve workflows. By continuously learning from the parameters used in previous encodes, AI-optimized settings can be applied to every new video file. Furthermore, every asset encoded with such a service helps train the machine-learning model and makes predictions for future encodings more accurate. This results in faster processing times and significantly higher quality with no increase in bandwidth. And the savings from properly configured AI-driven encoding are substantial.
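A minimal sketch of the learn-from-previous-encodes idea: regress a quality-preserving bitrate from simple content features of past sessions. The features, numbers, and model choice below are illustrative assumptions rather than any encoder vendor’s actual method.

```python
# Illustrative sketch: predict a per-title bitrate from simple content
# features, trained on past encoding sessions. All numbers are synthetic.
from sklearn.ensemble import RandomForestRegressor

# Features per past encode: [motion score, spatial detail, frame rate]
past_features = [
    [0.9, 0.7, 60],   # sports: high motion
    [0.2, 0.4, 24],   # talking heads: low motion
    [0.6, 0.8, 30],   # drama with detailed scenes
    [0.1, 0.2, 24],   # static slideshow
]
bitrate_kbps = [6500, 1800, 4200, 900]  # settings that hit the quality target

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(past_features, bitrate_kbps)

new_title = [[0.8, 0.6, 50]]            # features of the next asset to encode
print(f"suggested bitrate: {model.predict(new_title)[0]:.0f} kbps")
```

Each completed encode becomes a new training row, which is why predictions improve as the library grows.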

Netflix, for instance, estimates that its use of AI to automate workflows and reduce customer churn saves the company around $1 billion annually. This not only increases the quality of experience and quality of service for users but also reduces the number of bits required to achieve the same quality stream. YouTube is also at the forefront of using AI to reduce overall video latency and encoding costs. Not only does a properly configured AI system process video as it is ingested, it can also dip deep into existing library files and process those.

AI functionality is also being integrated into Media Asset Management (MAM) systems, where it helps automatically recognize elements within audio and video and generate associated metadata, making it easier to sort, locate, and use content across all MAM workflows. Content owners then no longer have to rely solely on manual effort to tag and catalog assets, a time-consuming and expensive process. However, AI capability in MAM systems is not completely hands-off from a human standpoint—yet.
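As a rough sketch of automated metadata generation, the snippet below tags sampled frames of a video with a pretrained image classifier; the asset name and sampling rate are assumptions, and a production MAM integration would map such labels onto its own taxonomy:

```python
# Rough sketch: auto-tag sampled frames of a video with a pretrained
# classifier to seed MAM metadata. File name and sampling are illustrative.
import cv2
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

cap = cv2.VideoCapture("clip.mp4")      # illustrative asset name
tags = set()
frame_no = 0
while True:
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_no)
    ok, frame = cap.read()
    if not ok:
        break
    image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    with torch.no_grad():
        scores = model(preprocess(image).unsqueeze(0)).softmax(dim=1)
    tags.add(labels[int(scores.argmax())])
    frame_no += 150                     # sample roughly every 5 s at 30 fps
cap.release()
print("suggested tags for review:", sorted(tags))  # humans still QC these
```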

All these functionalities require human review and quality control right now, but one of the key characteristics of AI and machine learning is the ability to learn and improve over time, so these functions will continue to evolve going forward.

Over the past five years, artificial intelligence has moved out of the laboratory and into real products—you only have to go as far as Apple’s Siri and Amazon’s Alexa to find examples in the real world. The idea of a computerized assistant has now become real to millions, and that increases the pressure for similar machine aids in various professions and industries… including broadcasting.

It’s clear that the most efficient use of bandwidth and the ability to quickly create targeted programming are of great interest to broadcasters, and artificial intelligence is helping to make that possible.

Where are we headed now?

Having said all this, there is no harm in playing devil’s advocate.

All things come to an end, especially economic cycles. Business cycles, most economists have noticed, follow an eight-to-ten-year pattern, whereas IT cycles seem to run about three and a half years – meaning that you can generally fit two and perhaps three IT waves into a given economic cycle. These are also referred to as tech epochs.

The first tech epoch in this cycle was the rise of mobile from the ashes of the 2008 meltdown. In 2010, if you were not in mobile phones or mobile apps, you were not making money. By 2012, IT recruiters were throwing the “I have app experience” resumes in the trashcan because, well, no one was making money in the mobile space anymore. About this time, you began hearing about Big Data. Java developers got a new lease on life as the rise of Hadoop suddenly made Java relevant again. In 2018, the two biggest remaining Hadoop players finally merged, because demand had collapsed and the market could no longer support an alpha-and-beta dominance play. This second epoch also saw the rise of Bitcoin and ICOs, with Bitcoin falling spectacularly from a high of USD 20,000 to about USD 3,900 today (amidst the complete collapse of most other ICOs).

This takes us to the most recent cycle, the Data Science / AI / Blockchain epoch. Big Data needed Big Data Scientists to analyze all that data, because the average businessman just could not fit it cleanly into a spreadsheet. This led a lot of companies to spend a lot of money enticing data scientists, who then sat for two years twiddling their thumbs because there really wasn’t much in the data these companies were throwing at them.

The areas that have seen the most dramatic growth have been in machine learning and deep learning, areas that fall into the general domain of artificial intelligence. AI as a rubric or category almost inevitably gets trotted out toward the end of an economic cycle – these technologies are more forward-looking, riskier in terms of speculation, and often may have a longer time frame before they are likely to bear results.

This is the next level of AI: the judgment of an AI will improve, quietly and unobtrusively, due both to the accumulation of ever-larger amounts of learning material and to periodic upgrades in algorithms and back-end capacities.

To wrap it up, AI in 2019 and beyond will begin to disappear from the front pages and instead simply become augmentative infrastructure – making it easier to locate things, easier to control things remotely, easier to analyze or create. AI can help us model scenarios and explore the potential consequences of taking or not taking certain actions, but the process of deciding to take those actions rests firmly in human hands. The next couple of decades will likely be contentious because of that!
