The first industrial revolution, beginning in the mid-eighteenth century, was sparked by several critical enablers coming together:

  • steamships and railways brought raw materials to factories;
  • reliable power sources became available as windmills and waterwheels gave way to steam engines; and
  • advances in the production of steel and improved manufacturing techniques enabled the development of more efficient specialised tools.

Similarly, the technological enablers for the rapid advance of artificial intelligence (AI) have been falling into place in recent years – something we explored in our recent report: AI: Understanding and harnessing the potential. One critical advance has been the speed of Graphics Processing Units (GPUs). Originally designed to perform millions of calculations in parallel for video games, their highly parallel architectures are equally well suited to machine learning workloads.
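
To see why, note that the core of most machine learning workloads is dense linear algebra, where every output value can be computed independently. A minimal NumPy sketch (sizes are illustrative only):

```python
import numpy as np

# A neural-network layer is, at its core, a large matrix multiplication.
# Every element of the output is an independent dot product, which is
# exactly the kind of work a GPU's thousands of cores run concurrently.
batch, n_in, n_out = 256, 1024, 1024                  # illustrative sizes
x = np.random.randn(batch, n_in).astype(np.float32)   # input activations
w = np.random.randn(n_in, n_out).astype(np.float32)   # layer weights

y = x @ w       # 256 x 1024 = 262,144 independent dot products
print(y.shape)  # (256, 1024)
```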

The chart below shows how, in just five years, GPUs have seen an almost ten-fold increase in processing power:

[Chart: processing power of Nvidia's GTX range of GPUs. Source: our presentation of data from Nvidia.]
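
For context, a ten-fold increase over five years corresponds to a compound growth rate of roughly 58% per year:

```python
# Ten-fold growth over five years implies a yearly factor of 10^(1/5).
annual_growth = 10 ** (1 / 5) - 1
print(f"{annual_growth:.0%} compound annual growth")  # ~58%
```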

AI really took off a few years ago, when hardware well suited to its computationally intensive nature became available. In the future, further advances in hardware will enable even greater acceleration in the capabilities of AI.

Neural processors

Neural, or AI-specific, processors have been attracting growing interest, with recent announcements from key players including Google, Huawei, IBM, Intel, NVIDIA, Samsung and Tesla.

Bespoke neural chips will continue to drive gains in processing performance and power efficiency through designs that are better optimised for neural networks and other machine learning tasks. These developments will support advances both in training AI systems and in deploying trained models on user devices.
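
One optimisation such chips commonly exploit is low-precision arithmetic: storing weights as 8-bit integers rather than 32-bit floats cuts memory traffic and energy per operation at a small cost in accuracy. A minimal sketch of symmetric weight quantisation (purely illustrative, not any vendor's actual scheme):

```python
import numpy as np

# Quantise float32 weights to int8 with a single symmetric scale factor.
w = np.random.randn(1024, 1024).astype(np.float32)
scale = np.abs(w).max() / 127.0
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# Dequantise to measure the approximation error introduced.
w_restored = w_int8.astype(np.float32) * scale
print("max error:", float(np.abs(w - w_restored).max()))
print("bytes: float32 =", w.nbytes, " int8 =", w_int8.nbytes)  # 4x smaller
```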

For example, the type of neural processor embedded in Apple’s latest iPhone enables more voice and image recognition tasks to be carried out locally. This type of system architecture cuts latency and power consumption, reduces reliance on connectivity, and offers privacy and security benefits, since data never needs to leave the device.
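
As a toy sketch of that local-only data flow (every name, shape and model here is hypothetical, not Apple’s actual API), note that the raw samples are consumed on the device and only the small classification result survives:

```python
import numpy as np

def recognise_locally(samples: np.ndarray, weights: np.ndarray) -> int:
    """Stand-in for the work the neural processor performs on-device."""
    scores = samples @ weights        # computed entirely on the device
    return int(np.argmax(scores))     # only this small result persists

audio = np.random.randn(1, 512).astype(np.float32)   # raw data stays local
model = np.random.randn(512, 10).astype(np.float32)  # shipped with the app
print("predicted class:", recognise_locally(audio, model))  # no upload
```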

Quantum computing

The impact that quantum computing will have when combined with AI is only just beginning to be explored, but some predict breakthroughs in as little as four to five years. Investments by Google, Intel, Microsoft and others will advance research and ultimately produce a practical quantum machine. Although not all tasks are suited to quantum computing, two expected to benefit in particular are factoring the large numbers that underpin many of today’s encryption techniques and executing machine learning algorithms.
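
Why factoring? The heart of Shor’s algorithm is period finding: a quantum computer finds the period of modular exponentiation exponentially faster than any known classical method, and the factors then follow classically. A toy illustration in plain Python, brute-forcing the period that the quantum step would find efficiently:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r with a^r = 1 (mod n); requires gcd(a, n) == 1."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

# Toy example: factor N = 15 using base a = 7.
N, a = 15, 7
r = find_period(a, N)            # r = 4; this is the quantum computer's job
assert r % 2 == 0                # Shor retries with a new 'a' otherwise
half = pow(a, r // 2, N)         # a^(r/2) mod N = 4
print(gcd(half - 1, N), gcd(half + 1, N))  # -> 3 5, the factors of 15
```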

As the huge growth in real applications of AI continues, there will be increasing demand for even faster, more powerful AI systems running both on end devices such as smartphones and on powerful servers. The boom in the processing power of GPUs has been a key enabler of this recent growth, and the pace will quicken further as neural processors mature and, eventually, quantum computing becomes a reality.

Author
Tim Winchcomb
Technology Strategy Group Leader

Tim is a senior technology strategy consultant in our Wireless and Digital Services division, with more than 15 years’ experience in the high-tech sector, ranging from product development to commercial strategy.