The following is a summary of my predictions of ICT trends for 2020. They have been selected for their impact on the networking industry, and they forecast what is expected to happen, or start happening, within the next 12 months. This information incorporates input and insights from several sources, available in a supporting document.

Cisco Top trends for 2020 - Kevin Bloch, CTO

“A machine with basic reading capabilities will be able to read everything the human race has
ever written by lunchtime, and then it will be looking around for something else to do.”

– Stuart Russell, Human Compatible

Welcome to the cognitive era. Compute costs will continue to head towards zero. Data volumes will continue to skyrocket. Artificial Intelligence (AI) will shift from research labs into business operations. The network will leverage, connect and combine these to enable massive transformation at machine scale. On the one hand, we have ‘digital abundance’, in which information and knowledge in the form of bits can be transferred and reproduced perfectly and almost infinitely at negligible cost. On the other hand, it has never been more important to appreciate that our global physical resources (‘atoms’) are finite, that our privacy and trust are sacrosanct, and that governments and regulators are beginning to take unprecedented action against global digital giants to protect citizens and rebuild trust. It has never been more important to harness the exponential power of these new technologies in a manner that is ethical, provides consistency for business and is socially responsible – for our personal well-being, the planet and the long-term benefit of its inhabitants.

1.   Artificial Intelligence – serving humans at scale

Many believe AI to be the most important technology of our lifetime. Until recently, AI was accessible only to a select few, requiring intensive research, compute power and significant funding. We are now entering an era of commoditised, democratised, narrowly focused AI that is available to a much broader market. A wide range of tools already exists to enable the rapid, cost-effective development of sophisticated AI and Machine Learning (ML). In 2020, leading organisations will aim to scale AI in order to enhance and differentiate their business. Specifically, we will see deployment of cognitive technology in sectors such as oil and gas (autonomous rigs), construction and manufacturing (digital twins), retail (behaviour) and healthcare (multiomics*). While data quality and use are, and will continue to be, important, this pending proliferation of AI will require a shift in focus from data inputs to the quality and outcomes of AI outputs. Given that AI is now entering its next phase of deeper and wider integration into business and society, with significant and potentially ominous implications, oversight and governance in the form of industry guidelines, organisational ethical frameworks and possibly regulation will become increasingly common.

* Multiomics is a new approach in which the data sets of different omic groups are combined during analysis. The different omic strategies employed during multiomics are genome, proteome, transcriptome, epigenome, and microbiome.

2.   Network transformation – building the cognitive internet for the future

As the foundation for the connected intelligence economy, networks will need to be rebuilt to support massive demand. Existing approaches to network design are becoming economically and operationally unsustainable as the number of connections passes 30 billion and global IP traffic doubles to 396 Exabytes per month between 2019 and 2022 (when 5G is just getting started!). 400G Ethernet deployment will commence at scale in data centres. Wi-Fi 6 will be introduced as the primary wireless off-load technology within buildings, and 5G will start to become the mobile access of choice in the enterprise. While fixed connections continue to grow, wireless connections are growing faster. New silicon optical technology will become commercially available and provide the underlay for both fixed and mobile networks. Cognitive networking, or ‘Intent-Based Networking’ (IBN), has now proven that it can address critical issues of complexity, scale and automation. Having taken hold in the data centre a few years ago, IBN took off in the wide area network in 2019 via SD-WAN. In 2020, IBN will reach into the access network and span public and private cloud, enabling secure, end-to-end policy-based management, scale and simplification.
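The traffic figures above imply a steep but quantifiable growth rate. As a minimal back-of-envelope sketch in Python (assuming, per the cited doubling, a starting point of 198 Exabytes per month in 2019), the implied compound annual growth rate works out to roughly 26% per year:

```python
# Implied compound annual growth rate (CAGR) for global IP traffic,
# assuming it doubles from 198 to 396 Exabytes/month between 2019 and 2022.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, returned as a fraction (e.g. 0.26 = 26%)."""
    return (end / start) ** (1 / years) - 1

rate = cagr(198, 396, 2022 - 2019)
print(f"Implied annual growth: {rate:.1%}")  # prints "Implied annual growth: 26.0%"
```

Doubling over three years is equivalent to multiplying by the cube root of two each year, which is where the ~26% figure comes from.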

3.   5G – the best is yet to come

While some may argue that 5G is one of the most over-hyped technologies, don’t be misled: 5G will happen. Preliminary estimates suggest the shift from 4G LTE to 5G NR (New Radio) will be roughly two to three years faster than the 3G to 4G migration, and 5G promises several substantial technical advantages. Despite a stream of announcements from the industry, 2021, not 2020, will likely be the year 5G uptake starts to scale. The 3GPP standards that will underpin all 5G NR technology are expected to be finalised in early 2020. Apple’s expected support in late 2020 and more widely available spectrum (mmWave) will trigger interest, demand and investment post-2020. Fixed-Wireless Access (FWA) will initially have the most solid business case, impacting demand for fixed (wired) broadband access. Hopefully more 5G-dependent business cases will emerge, but don’t disregard existing, proven alternatives such as 4G (LTE), Wi-Fi 6 and LoRaWAN. Ultimately, for everyone except the industry itself, 5G is less about what it is and all about the services it can enable.

4.   Cloud and edge – multi-cloud, hybrid-cloud and edge continuum

Over the past decade, the cloud has transformed IT and enabled every sector currently undergoing digital transformation. Now cloud itself is being transformed by technological advances, in particular in AI, and by the need to harness the value and volume of data. Cloud providers will compete aggressively to provide efficient on-demand cognitive services and data sets. Workloads will execute on bare metal (some for AI/ML-specific algorithms), on serverless platforms or, soon, on quantum computers. This will introduce a new set of challenges and opportunities and open up a new battleground for cloud providers who aim to own and secure the customer both intra- and inter-cloud. It is also the end of cloud as we know it: the number of applications (intelligence) at the edge will increase 800% by 2024, driven by increasing volumes of data, AI, 5G, low-latency use cases and real-time streaming services.

5.   Economic dislocation – the beginning of the end of permission-less innovation

Governments and regulators are taking a stand. Tangible evidence of this includes several multi-billion-dollar fines for data misuse recently imposed on some global tech companies and a significant overhaul of various regulatory regimes earmarked for 2020. The view is gaining ground that companies should be treated as what they really are (i.e. advertising, media or taxi companies) rather than being allowed to masquerade as technology companies. We can expect particular regulatory scrutiny and government action to address data breach and misuse, political influence, tax and regulatory avoidance, algorithmic accountability and anti-competitive conduct. While this may look like a limited ‘government-versus-big-tech’ battleground, businesses in every sector will need to deploy adequate systems and governance arrangements, and to monitor and approach compliance comprehensively, in order to remain ready to respond and take advantage of the promises the digital age presents.

6.   Security transformation – focus on workforce, workplace and workload

It’s getting worse! Many organisations need to defend against thousands of cyber attacks per minute, while an attacker need only break through once to cause serious damage. It’s not just the volume of attacks that is increasing; so too are the sophistication and ingenuity. Some attacks are being concealed within other attacks, RAT (Remote Access Trojan) infestation is serious, hackers are leveraging AI/ML, lasers are being used to control microphones in intelligent speakers, and ransomware is up 500 percent. Cybersecurity is inherently a human-driven phenomenon of people wanting to do harm, steal money, cause damage, or hurt businesses or governments. Boards and senior management can expect more intensive scrutiny and government action on non-conformance. In order to improve risk management, organisations will need to find the right balance of human and technology capabilities. AI, cloud, identity management, the network and zero trust will become integral cyber defence tools. However, each alone is insufficient. An architectural, systems approach, coupled with an organisation-wide cultural awareness of cyber risks, is essential.

7.   Corporate social responsibility – a new form of capitalism: purpose beyond profit

Enter a new form of capitalism that is more socially responsible. Some of these responsibilities will no longer be optional; for example, some governments will begin to take action against businesses that fail to act on climate risk. Governments will come under further pressure too. Corporate executives representing millions of workers globally have expressed support for the Paris Agreement in order to avert catastrophic temperature rises, arguing that the battle against climate change will protect their nations’ economies and create jobs and businesses. If ignored, estimates suggest the economic impact could be in the hundreds of billions of dollars, if not trillions. There are plenty of issues to deal with, including the climate, carbon emissions, waste (the circular economy), trust, diversity, equality and the social divide. Those that embrace social responsibility will enhance their brand and attract talent. In 2020 this issue will become more critical. There has never been a time in history when the planet has needed technology as it does today to help save itself and its inhabitants.

8.   Hyper-personalised healthcare – medical science and IT converge

Two revolutionary tools are overhauling healthcare as we know it: genome sequencing and CRISPR* engineering. Sequencing the first human genome took about 13 years of research and over US $2.7 billion to complete. Today, it is on its way to becoming a $100, one-hour process. The convergence of medical science, biotechnology and digital technologies is showing significant potential to improve well-being and reduce costs. As more patients are connected and more data is collected in real time, systems will learn and become more intelligent, enabling even deeper insights and predictive analysis within and across communities. The possibilities include eliminating many diseases, hereditary conditions and disabilities, but they concurrently open many deeper ethical questions that will need to be addressed.
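The scale of the improvement cited above is worth making explicit. A minimal sketch in Python, using only the figures from the text (US $2.7 billion and roughly 13 years for the first genome, versus a projected $100 and one hour today):

```python
# Back-of-envelope reduction factors for human genome sequencing,
# using the figures cited in the text.
first_cost_usd, target_cost_usd = 2.7e9, 100     # ~US$2.7 billion -> ~$100
first_time_h, target_time_h = 13 * 365 * 24, 1   # ~13 years -> ~1 hour

cost_factor = first_cost_usd / target_cost_usd
time_factor = first_time_h / target_time_h
print(f"Cost reduced ~{cost_factor:,.0f}x, time reduced ~{time_factor:,.0f}x")
# prints "Cost reduced ~27,000,000x, time reduced ~113,880x"
```

A 27-million-fold cost reduction in about two decades far outpaces even Moore’s-law improvements in compute, which is what makes this convergence of medicine and IT so disruptive.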

*Clustered Regularly Interspaced Short Palindromic Repeats