The hugely exciting – not to say revolutionary – promise of quantum computing comes with a significant challenge. The seismic technology shift that is already underway will render a chunk of our current cryptographic standards moot. Specifically, I’m referring to key establishment mechanisms and authentication methods – both of which have a sizable impact on our current security ecosystem.

Quantum computing – we must confront the threat to digital security

For me, the implication is clear. Now is the time to think about and design the mechanisms which will keep us and our products safe. So, where to start? In this article, I’m going to outline some thoughts and insights pertinent to anyone developing systems that need to be secure for the long term. My main theme is this: how do you future-proof systems and embed post-quantum cryptography (PQC) into today’s devices?

Let's consider this question from the viewpoint of a product company. Say it produces a small device which it sells to the masses. (This is just one view of the problem, by the way; the consequences will reach beyond device designers and manufacturers.) The device relies on networked communications to provide smart features. Data is moving to and fro, and it must be kept secure and private.

Right now, we secure the data with technology based on established security standards which we understand and can meet. But what happens when some of the assumptions underlying those standards fall through – when a new threat emerges, for example? Can the cryptography be swapped out retrospectively, without a significant redesign or redeployment?

The complicated quantum computing threat

The first thought will be to patch in some fixes; to upgrade and be safe again. While this might be possible in some cases, it is more complicated with the quantum computing threat. PQC algorithms are fundamentally different from current standards: keys, signatures and ciphertexts are typically much larger, and the underlying computations are different. As a result, optimisations made to improve a device’s efficiency under current algorithms may not carry over to PQC algorithms. So, if the device will be around for 10 years or so, this needs to be addressed in the design now.
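To make “fundamentally different” concrete, consider object sizes. The figures below are approximate values from the published Kyber768 and Dilithium2 parameter sets (the finalised NIST standards may differ slightly), and the script itself is purely illustrative:

```python
# Rough storage comparison: classical vs post-quantum object sizes, in bytes.
# Figures are approximate, taken from the published Kyber768 / Dilithium2
# parameter sets; consult the final NIST FIPS documents for exact values.
SIZES = {
    "ECDH P-256 public key": 65,      # uncompressed curve point
    "Kyber768 public key":   1184,
    "Kyber768 ciphertext":   1088,
    "ECDSA P-256 signature": 64,
    "Dilithium2 signature":  2420,
}

def overhead(pqc: str, classical: str) -> float:
    """How many times larger the PQC object is than its classical analogue."""
    return SIZES[pqc] / SIZES[classical]

if __name__ == "__main__":
    for pqc, cls in [("Kyber768 public key", "ECDH P-256 public key"),
                     ("Dilithium2 signature", "ECDSA P-256 signature")]:
        print(f"{pqc}: {overhead(pqc, cls):.0f}x larger than {cls}")
```

A thirty-fold growth in signature size is exactly the kind of overhead that breaks a device whose flash budget was sized tightly around today’s algorithms.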

Here at Cambridge Consultants, we have been testing whether it is possible to build PQC algorithms into IoT devices now. To do this, we have been pushing the boundaries of the ability of today’s hardware to run the PQC algorithms needed in the future. It has been a challenge and a lot has been learnt from the work. Whilst we’ve shown that it is possible to future-proof devices at a reasonable cost, you need to consider a number of important factors:

  • The right chip must be chosen, with enough memory and storage to run the new algorithms
  • Firmware must be updatable securely. New digital signature algorithms will be more computationally expensive, and the device must have enough storage for the larger signatures and any other metadata associated with the operation
  • Cryptographic agility must be considered. The flexibility to switch the underlying crypto easily is important when assumptions about an algorithm’s security fall through. Web-based standards such as the X.509 public key certificate standard already include provisions for swapping out the underlying algorithms; custom low-cost, bulk-manufactured devices have no such safety net, so agility must be designed in
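As a sketch of what cryptographic agility can look like in a firmware-update flow, the snippet below dispatches on an algorithm identifier carried alongside the signature, so the scheme can be swapped without changing the surrounding logic. Everything here is hypothetical: the identifiers are made up, and the hash-based “verifier” is a stand-in, not a real signature check.

```python
# Illustrative sketch of cryptographic agility in firmware verification.
# The algorithm identifier travels with the signature; the device looks up
# the matching verifier, so new schemes can be added and old ones retired.
import hashlib
from typing import Callable, Dict

def verify_sha256_demo(blob: bytes, sig: bytes, key: bytes) -> bool:
    # Placeholder "verifier" for demonstration only; a real device would
    # call into ECDSA today, or a PQC scheme such as Dilithium tomorrow.
    return sig == hashlib.sha256(key + blob).digest()

VERIFIERS: Dict[str, Callable[[bytes, bytes, bytes], bool]] = {
    "demo-sha256": verify_sha256_demo,
    # "ecdsa-p256": verify_ecdsa_p256,   # today's algorithm (hypothetical hook)
    # "dilithium2": verify_dilithium2,   # dropped in after the transition
}

def verify_firmware(alg_id: str, blob: bytes, sig: bytes, key: bytes) -> bool:
    verifier = VERIFIERS.get(alg_id)
    if verifier is None:
        raise ValueError(f"unknown algorithm: {alg_id}")
    return verifier(blob, sig, key)
```

The design choice that matters is the indirection: the update pipeline never names an algorithm directly, so switching crypto is a table change rather than a redesign.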

The use case matters: PQC mainly affects asymmetric crypto (key establishment and digital signatures), while symmetric algorithms are expected to remain usable with larger key sizes. Some use cases which could experience a disproportionate impact are:

  • Large device volumes with inter-device communications. Here, a more computationally expensive key exchange mechanism could have a large impact
  • Large message volumes, where authenticity of the messages matter. Potentially slower signature algorithms could have various impacts on the performance of your system
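A back-of-envelope calculation illustrates the first point. Assuming, purely for illustration, a fleet of a million devices reconnecting hourly, with approximate Kyber768 object sizes compared against ephemeral ECDH P-256:

```python
# Back-of-envelope bandwidth impact of a PQC key exchange across a fleet.
# Kyber768 exchanges roughly a public key (1184 B) plus a ciphertext (1088 B)
# per handshake, versus two uncompressed P-256 points for ephemeral ECDH.
# All fleet numbers below are illustrative assumptions, not measurements.
DEVICES = 1_000_000
HANDSHAKES_PER_DAY = 24          # e.g. one reconnect per hour
ECDH_BYTES = 2 * 65              # two uncompressed P-256 public points
KYBER_BYTES = 1184 + 1088        # approximate Kyber768 pk + ciphertext

def daily_overhead_gib(per_handshake_extra: int) -> float:
    """Extra handshake traffic across the whole fleet, in GiB per day."""
    return DEVICES * HANDSHAKES_PER_DAY * per_handshake_extra / 2**30

print(f"~{daily_overhead_gib(KYBER_BYTES - ECDH_BYTES):.0f} GiB/day extra")
```

Tens of gibibytes a day is trivial for a cloud backend but can be very real for constrained radio links, which is why the impact is so use-case dependent.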

The news is not all bad. While this does demand that you build devices with greater memory and storage resources, several favourable trends mean you can do this now. Hardware resources are getting cheaper, so the new overheads will be affordable in many cases. Work is also underway to integrate the new algorithms into common tools and libraries such as OpenSSL, meaning you shouldn’t need to migrate to vastly different tools and technologies than those you are already familiar with.
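One transition pattern worth knowing, used in draft hybrid TLS key-exchange designs, is to derive the session key from both a classical shared secret and a PQC shared secret, so the session stays protected if either assumption fails. Below is a minimal sketch: the random byte strings are placeholders standing in for real ECDH and Kyber outputs, and a single HMAC-SHA256 extract stands in for a full HKDF.

```python
# Hybrid key derivation sketch: combine a classical and a PQC shared secret
# so the derived key is safe if either underlying scheme is broken.
# Placeholder secrets only; a single HMAC extract stands in for full HKDF.
import hashlib
import hmac
import os

def hybrid_key(classical_secret: bytes, pqc_secret: bytes, salt: bytes) -> bytes:
    # Concatenate-then-extract, in the spirit of draft hybrid TLS designs.
    return hmac.new(salt, classical_secret + pqc_secret, hashlib.sha256).digest()

# Placeholder secrets standing in for ECDH and Kyber shared secrets.
session_key = hybrid_key(os.urandom(32), os.urandom(32), b"example-salt")
assert len(session_key) == 32
```

The appeal of the hybrid approach during the transition is that you never bet the product on the new algorithms alone: the classical secret still protects you while confidence in PQC matures.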

Preparing for the PQC switch

It’s time to prepare for the switch to PQC. It starts by asking the question, ‘is this a product risk which warrants protection against similar future events?’ This is essentially crystal ball gazing. How will the device be used in the future? What will it be connected to? Will it be used in the benign use case that I envisage now, or will it be used in ways I don’t expect and need to protect against? How likely is it to be around in 10-plus years? Is the architecture I am laying down now the basis of a product line for the future – that is, even if a device’s life is short, am I using the same base hardware architecture in all my future devices?

The chances are that many of the devices we design in the coming years will still be around, in some form, in the quantum era. So, can you design devices now and optimise them for the future threat? Or even design in the ability to adapt to future threats? The answer is yes, but it is not straightforward, as I’ve shown.

The thoughts I’ve offered here come from our recent work on designing PQC into devices. We must be prepared to meet any disruption that could affect the security of our products. In fact, the US National Cybersecurity Center of Excellence has released a NIST white paper on the subject, which strongly recommends that companies plan and prepare for this transition now.

If you’d like to discuss any of the topics raised, do email me. It would be great to continue the conversation.

Author
Liam Lombard
Software Engineer

As a software engineer, Liam focuses mainly on simulation and control systems, and is also involved in creating genetic algorithms and secure architectures and concepts to realise clients’ business plans. He concentrates on combining technologies in novel ways to produce beneficial products.