Five key technologies have been shortlisted for enterprises to explore in 2017. Adoption and integration of these systems with existing enterprise platforms may lag for some time, but ultimately there will be no option except to make them a fundamental part of the analytics fabric.
2017 is expected to be the year when smart energy and smart entertainment reach mainstream adoption, and voice assistants become a must-have in the home.
Internet of Things

The Internet of Things (IoT) will become the central strategy for the market makers of the next decade. Through strategic platforms, it will underpin everything from smart buildings, VR and AR platforms for gaming, and digital asset management to always-connected products and appliances.
But even with laws on cyber surveillance in place, the edge sensor and analytics systems that run off the IoT data backplane will become a new Pandora's box of cyber risks. There are few commonly accepted standards for IoT devices, and vendors do not seem to work as hard at securing connected devices as they do at securing more traditional endpoints, like laptops and smartphones. Even if a hacked IoT device does not represent much of a threat on its own, it is simple enough to incorporate it into a vast botnet, which is exactly what the attackers behind the Mirai botnet have been doing lately: exploiting DVRs, surveillance cameras, and other poorly secured IoT devices and turning them into a zombie army able to hamstring Internet access. It is still unclear how many devices have been infected, but Flashpoint estimates that as many as five million devices are vulnerable in at least 10 countries besides Germany, where the outbreak was first observed.
IoT security requires strengthened network access controls, including real-time application control and visibility, IoT-supported secure-authentication methods, granular device policy enforcement at the edge, and centralized reporting and monitoring tools.
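Granular device-policy enforcement at the edge can be as simple as gating network admission on device identity and posture. A minimal sketch, assuming a hypothetical device inventory format; the device types, firmware baselines, and rules are illustrative only:

```python
# Toy edge admission check for IoT devices. Device types, firmware
# baselines, and policy rules here are hypothetical, not a vendor API.

MINIMUM_FIRMWARE = {"camera": (2, 1), "dvr": (4, 0), "thermostat": (1, 3)}

def admit(device):
    """Decide whether a device may join the IoT network segment."""
    dtype = device.get("type")
    if dtype not in MINIMUM_FIRMWARE:
        return False, "unknown device type"
    if not device.get("cert_valid", False):
        return False, "failed certificate-based authentication"
    if device.get("default_password", True):
        return False, "factory-default credentials still in use"
    if tuple(device.get("firmware", (0, 0))) < MINIMUM_FIRMWARE[dtype]:
        return False, "firmware below required baseline"
    return True, "admitted to IoT segment"

ok, reason = admit({"type": "camera", "cert_valid": True,
                    "default_password": False, "firmware": (2, 3)})
```

Note that the factory-default-credential check alone would have blocked most of the devices Mirai recruited.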
One way or another, IoT will shake up computing in 2017 – either as a key underpinning of a host of new technologies, or the venue for further devastating cyber attacks.
Augmented Reality and Virtual Reality
If computing companies have their way, then 2017 will be the year in which virtual reality (VR) and augmented reality (AR) – two closely-related but very different technologies – become widely popular.
AR and VR are close cousins and rely on similar technology. But the two technologies have one fundamental difference. VR is immersive – the headsets must, by necessity, block out the external world. Putting one on is tricky enough to ensure that glancing away, as one might do when watching television, is not really possible. The first wave of applications, therefore, is in video games and films, where users will prove willing to lock themselves into their virtual worlds.
AR, by design, maintains its user’s connection with the real world, and that means that a headset is not necessary. Heads-up displays are an early example of AR, but there are others – VeinViewer, for instance, is a medical device that projects images of a patient’s veins onto his skin, to help doctors aim injections. Many existing smartphone apps also make use of AR. Nonetheless, the biggest AR product launched this year is indeed a headset, specifically Microsoft’s HoloLens. It aims to liberate computing from a fixed screen, overlaying its user’s view with useful additions (painting an email across a nearby wall, for instance, or putting weather information on a breakfast table). The firm must hope it does better than another famous AR headset, Google’s Glass, which, after years of development and months of public tinkering, was finally sent back to the drawing board last year.
According to IDC, worldwide revenues for the AR/VR market will grow from USD 5.2 billion in 2016 to more than USD 162 billion in 2020. This represents a compound annual growth rate (CAGR) of 181.3 percent over the 2015–2020 period.
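The CAGR arithmetic is easy to check. Note that IDC quotes the 181.3 percent rate over 2015–2020; computed from the 2016 and 2020 revenues alone, the implied four-year rate is lower, which suggests a 2015 base of under USD 1 billion:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate as a fraction (1.0 == 100 percent)."""
    return (end_value / start_value) ** (1 / years) - 1

# IDC forecast: USD 5.2 billion (2016) -> USD 162 billion (2020)
rate_2016_2020 = cagr(5.2, 162.0, 4)    # ~1.36, i.e. ~136 percent per year
# A 181.3 percent CAGR over 2015-2020 implies a 2015 base of roughly:
base_2015 = 162.0 / (1 + 1.813) ** 5    # just under USD 1 billion
```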
“For many years, augmented and virtual reality were the stuff of science fiction. Now with powerful smartphones powering inexpensive VR headsets, the consumer market is primed for new paid and user-generated content-driven experiences. Recent developments in healthcare demonstrated the powerful impact augmented reality headsets can have at the industry level, and over the next five years we expect to see that promise become realized in other fields like education, logistics, and manufacturing,” stated Chris Chute, vice president, Customer Insights and Analysis, IDC.
“The rise of new, less expensive hardware will put virtual and augmented reality technology within the grasp of a growing numbers of companies and individuals. But, as always, what people can do with that hardware will depend upon the applications and services that power it. In the coming years, we expect developers to create a wide range of new experiences for these devices that will fundamentally change the way many of us do work,” said Tom Mainelli, vice president, Devices & AR/VR, IDC.
From a regional perspective, Asia-Pacific excluding Japan (APeJ), the United States, and Western Europe will account for three-quarters of worldwide AR/VR revenues. The three regions will generate comparable revenue in 2017, but the United States is forecast to pull well ahead of the other two by 2020. Because AR/VR technology is still in the early stages of adoption, every region is expected to see annual growth of more than 100 percent throughout the five-year period.
Other AR and VR predictions from IDC include:
- In 2017, retail industry spending on AR/VR hardware, software, and services will increase by 145 percent to more than USD 1 billion.
- Three out of 10 consumer-facing Fortune 5000 companies will experiment with AR or VR as part of their marketing efforts in 2017.
- By 2019, 10 percent of all web-based meetings will include an AR component, driving disruption of the USD 3 billion web conferencing market.
Triple A Protection
Information technology decision makers (ITDMs) are taking steps to protect the corporate network from threats of all sizes. However, as it stands, security is still at risk from both internal and external threats.
How can ITDMs know when they have reached a level of security that will protect against cyber attacks while still empowering employees to do their jobs better? A comprehensive security approach should encompass three factors: it should be adaptive to threats, business requirements, and the ever-evolving use of the Internet within the corporate network; it should be adapted to meet the specific requirements of the organization; and it should be adopted fully by end users. These factors can be summarized as a Triple A security approach.
Gartner's assessment of the security situation includes the following predictions:
- By 2020, 60 percent of digital businesses will suffer major service failures due to the inability of IT security teams to manage digital risk.
- By 2020, 60 percent of enterprise information security budgets will be allocated for rapid detection and response approaches, which is an increase from less than 30 percent in 2016.
- By 2018, 25 percent of corporate data traffic will flow directly from mobile devices to the cloud, bypassing enterprise security controls.
- Through 2018, over 50 percent of IoT device manufacturers will not be able to address threats because of weak authentication practices.
Gartner has outlined the top six trends driving the need for adaptive, context-aware security infrastructures – mobilization, externalization and collaboration, virtualization, cloud computing, consumerization, and the industrialization of hackers. The premise of the argument for adaptive, context-aware security is that all security decisions will be based on information from multiple sources.
As these platforms become more sophisticated and trusted in 2017, they will be able to spot attacks in earlier stages and stop them before they become active breaches.
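The multi-source decision model behind context-aware security can be sketched as a simple risk-scoring gate. The signal names, weights, and threshold below are hypothetical; a real platform would learn and adapt them continuously:

```python
# Toy adaptive access decision: combine weighted risk signals gathered
# from several context sources and compare against a threshold.
# Signals, weights, and threshold are illustrative only.

RISK_WEIGHTS = {
    "unmanaged_device": 0.30,
    "unusual_location": 0.25,
    "off_hours_access": 0.15,
    "sensitive_resource": 0.30,
}

def risk_score(signals):
    """Sum the weights of all context signals currently raised."""
    return sum(w for name, w in RISK_WEIGHTS.items() if signals.get(name))

def decide(signals, threshold=0.5):
    """Allow or deny access based on combined contextual risk."""
    if risk_score(signals) >= threshold:
        return "deny"   # a real system might instead require step-up auth
    return "allow"
```

For example, an unmanaged device connecting from an unusual location scores 0.55 and is denied, while a managed device working off-hours scores only 0.15 and is allowed.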
Blockchain for Smart Contracts
Enterprises will put blockchain hype to the test as they start exploring its ability to reduce transaction costs, streamline partner interactions, and accelerate business processes.
A blockchain – the technology underlying bitcoin and other cryptocurrencies – is a shared digital ledger, or a continually updated list of all transactions. This decentralized ledger keeps a record of each transaction that occurs across a fully distributed or peer-to-peer network, either public or private. A blockchain’s integrity hinges on strong cryptography that validates and chains together blocks of transactions, making it nearly impossible to tamper with any individual transaction record without being detected.
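The chaining idea can be shown in a few lines: each block stores the hash of its predecessor, so altering any earlier transaction invalidates every later link. A minimal sketch, not a real blockchain (no consensus, no proof of work, no network):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (excluding its own hash) deterministically."""
    payload = json.dumps({k: v for k, v in block.items() if k != "hash"},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Append a block that links back to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev,
             "transactions": transactions}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain):
    """Detect tampering: each hash must match its block and its successor's link."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
assert verify(chain)
chain[0]["transactions"][0]["amount"] = 500   # tamper with an early record
assert not verify(chain)                      # tampering is detected
```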
In the financial world, blockchains are expected to disrupt how financial institutions conduct payments and wire transfers, process securities trades, and handle compliance reporting, to name just a few use cases.
Outside of finance, industry watchers cite opportunities for blockchains to play a role in core business functions from supply chain and manufacturing to legal and healthcare. When an audit trail is required – to track the provenance of finished goods, for example, or to document a real estate title – blockchain networks can be used to create verifiable, tamper-proof records in an encrypted format without a central authority.
Early adopters have launched hundreds of pilot projects, but there is a long way to go before blockchain hits mainstream adoption. Among the obstacles blockchain deployments face are technical challenges, lack of standards and governance models, shortage of skills, and scalability concerns.
As 2016 closes, vendors continue to devise distributed applications and platforms based on blockchain technology, and venture capital firms continue to pour money into the effort. More than USD 1.4 billion has been invested in blockchain technology over the past three years, according to an August report by the World Economic Forum (WEF). More than 90 corporations have joined blockchain development consortia, and more than 2500 patents have been filed. The WEF predicts that by 2017, 80 percent of banks will initiate projects that involve distributed ledger technology.
For enterprises interested in exploring how they can use blockchain and distributed ledgers, research firm Gartner recommends starting with limited-scope trials that are aimed at specific problems. Enterprises can start to investigate how distributed networks might improve business processes that are constrained by transaction inefficiency and how technology suppliers might be able to help.
Machine Learning, the Promise of Predicting the Future
Every company is now a data company, capable of using machine learning in the cloud to deploy intelligent apps thanks to three machine learning trends – data flywheels, the algorithm economy, and cloud-hosted intelligence. With hosted machine learning models, companies can now quickly analyze large, complex data sets and deliver faster, more accurate insights without the high cost of deploying and maintaining machine learning systems.
With more data becoming available, and the cost to store it dropping, machine learning is starting to move to the cloud, where a scalable web service is an API call away. Data scientists no longer need to manage infrastructure or implement custom code. The systems scale for them, generating new models on the fly, and delivering faster, more accurate results.
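At its core, "generating new models on the fly" means fitting a model to whatever data arrives. A minimal sketch using closed-form least squares – pure Python rather than a hosted service, with made-up data points – shows the kind of model such a service would train and then serve behind an API call:

```python
# Fit y = slope * x + intercept by ordinary least squares.
# The data points are illustrative only.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(model, x):
    """Serve a prediction from a fitted model."""
    slope, intercept = model
    return slope * x + intercept

model = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
forecast = predict(model, 5)
```

A hosted service wraps exactly this train-then-predict loop at scale: new data retrains the model, and clients only ever see the `predict` endpoint.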
It’s still early days for adoption, though. A recent study by Deloitte reveals that only 8 percent of enterprises use machine learning technology today. Allied Market Research predicts the industry is growing at a 33 percent CAGR and will reach USD 13.7 billion by 2020.
We have come a long way. But we still have a long way to go!