Science & technology | Cryptography

The world needs codes quantum computers can’t break

America’s standards agency thinks it has identified three

A digitised, pixelated lock and key (Illustration: Nick Kempton)

QUANTUM COMPUTERS, which exploit strange properties of the subatomic realm to crunch numbers in powerful new ways, do not actually work yet. But if and when they do start working, they will be able to break the cryptographic algorithms that currently protect online communications, financial transactions, medical records and corporate secrets.

Today’s algorithms generally rely on the fact that conventional computers struggle to factorise very large numbers. Finding the factors of RSA-2048, a 617-digit number often used as a benchmark for progress in the field, has eluded generations of classical computers. But experts believe that within a decade or two a quantum computer could emerge capable of cracking it in a day. That is already making cryptographers nervous. Sensitive data illicitly obtained today could be held on ice for years, until a sufficiently powerful code-breaker is built.
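
The asymmetry the paragraph describes can be seen even at toy scale: multiplying two primes is instantaneous, but recovering them from the product takes real work, and the cost grows explosively with the size of the number. A minimal sketch, using trial division on seven-digit primes (real RSA moduli are hundreds of digits longer):

```python
def trial_factor(n):
    """Factor n by trial division: fine for small n,
    utterly infeasible for a 2048-bit RSA modulus."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

# Multiplying two primes is instant, even at cryptographic sizes...
p, q = 1000003, 1000033
n = p * q
# ...but undoing the multiplication already takes about a million
# divisions for this 13-digit n, and the cost of the best known
# classical methods grows subexponentially with the bit-length of n.
print(trial_factor(n))  # -> (1000003, 1000033)
```

Shor’s algorithm, run on a large enough quantum computer, would collapse that asymmetry, which is why factoring-based systems such as RSA are not quantum-safe.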

New algorithms are therefore needed. And because switching over to them will take years, the transition to post-quantum cryptography (PQC) needs to begin as soon as possible. The National Institute of Standards and Technology (NIST), America’s standards agency, has now fired the starting gun for this transition. On August 13th NIST announced that three algorithms had been approved as official standards for PQC. Two are based on lattice problems, a type of mathematical puzzle thought to be hard for quantum and classical computers alike. The third is based on hash functions, one-way mathematical operations used throughout computing, and avoids putting too many eggs in a single basket.
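
To give a flavour of what a lattice problem looks like: the scheme underlying NIST’s lattice picks rests on “learning with errors” (LWE), in which a secret vector is hidden behind equations deliberately polluted with small random noise. The sketch below uses toy parameters far too small to be secure; it is meant only to make the structure visible:

```python
import random

# Toy learning-with-errors (LWE) instance, the kind of lattice problem
# underlying ML-KEM. The modulus and dimension here are illustrative
# assumptions; real parameters are vastly larger.
q, n = 97, 4
secret = [random.randrange(q) for _ in range(n)]

def lwe_sample(secret):
    """Return (a, b) where b = <a, secret> + small noise (mod q)."""
    a = [random.randrange(q) for _ in range(n)]
    noise = random.randrange(-2, 3)  # the crucial small error term
    b = (sum(x * s for x, s in zip(a, secret)) + noise) % q
    return a, b

# Without the noise, recovering `secret` from enough samples is routine
# linear algebra. With it, the problem is believed hard even for quantum
# computers -- that belief is what the new standards lean on.
samples = [lwe_sample(secret) for _ in range(8)]
```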


The announcement marks an important step in a continuing process. NIST began looking for quantum-safe algorithms in 2016, when it launched a competition for codes that future quantum computers would be unable to crack. Dozens of algorithms were submitted, mathematicians and cryptographers did their best to pick holes in them, and many fell by the wayside. Eventually, in July 2022, NIST announced a shortlist of four algorithms that were candidates for standardisation. Three of them were based on lattice problems. The fourth involved hash functions.

NIST also said it would continue to evaluate four backup algorithms, some of which might be adopted as standards in future. That is because nobody can ever be sure how secure an algorithm really is; there is always a risk somebody might discover a clever way to crack it. NIST consequently chose backup algorithms that did not rely on lattices. One of these, called SIKE, is based on the mathematics of isogenies, structure-preserving maps between elliptic curves. Elliptic curves are already used in some cryptographic systems today, but are not considered quantum-safe. Isogeny-based schemes, it was thought, would be.

Wrongly, as it turned out. In July 2022 Wouter Castryck and Thomas Decru, mathematicians at Katholieke Universiteit Leuven in Belgium, announced that they had found a way to crack SIKE. Worse still, their method could unlock data encrypted by SIKE in just four minutes, using a ten-year-old desktop PC. Fortunately, SIKE was the only example of an isogeny-based elliptic-curve cryptosystem under consideration by NIST, so this result did not imperil any other algorithms. Cue a big sigh of relief, and SIKE’s removal from the list of contenders for PQC.

Then in April 2024 came another unexpected result. Yilei Chen, of Tsinghua University in Beijing, issued a paper detailing a quantum algorithm that could solve certain lattice problems. This suggested that algorithms based on such problems might, after all, be vulnerable to quantum attack. Given that three out of four of NIST’s preferred algorithms were of this type, this was a potentially disastrous finding. Fortunately, a flaw was found in the paper almost immediately, and cryptographers sighed with relief once again.

One of NIST’s approved lattice-based algorithms, ML-KEM, is a method for distributing secret encryption keys, which allow the right recipient to decrypt the scrambled data. The other, ML-DSA, is an algorithm for digital signatures, a technique that allows users to prove their identity.
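
The interface ML-KEM standardises has three operations: key generation, encapsulation (the sender wraps a fresh secret for the recipient) and decapsulation (the recipient unwraps it). The sketch below shows only that three-call shape; the mathematics inside it is plain Diffie-Hellman over a small prime, not a lattice scheme, and the parameters are illustrative assumptions with no security whatsoever:

```python
import hashlib
import secrets

# Toy group parameters (an assumption for illustration): the largest
# 64-bit prime, with 5 as the base. Real KEMs use lattice arithmetic.
P, G = 0xFFFFFFFFFFFFFFC5, 5

def keygen():
    """Recipient: make a key pair and publish the public half."""
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk  # (public key, secret key)

def encapsulate(pk):
    """Sender: produce a ciphertext plus the shared secret it hides."""
    eph = secrets.randbelow(P - 2) + 1
    ciphertext = pow(G, eph, P)
    shared = hashlib.sha256(str(pow(pk, eph, P)).encode()).digest()
    return ciphertext, shared

def decapsulate(ciphertext, sk):
    """Recipient: recover the same shared secret from the ciphertext."""
    return hashlib.sha256(str(pow(ciphertext, sk, P)).encode()).digest()

pk, sk = keygen()
ct, key_sender = encapsulate(pk)
key_recipient = decapsulate(ct, sk)
assert key_sender == key_recipient  # both sides now hold one symmetric key
```

Protocols such as TLS then use the agreed key to encrypt the actual traffic with fast symmetric ciphers; the KEM’s only job is getting that key across safely.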

The third approved algorithm, SLH-DSA, is an alternative to ML-DSA built on hash functions, “to avoid relying only on the security of lattices”, as NIST explained. NIST will also continue to evaluate a trio of other algorithms, reliant on neither lattices nor elliptic curves, as possible alternatives to ML-KEM. They are thought to be highly secure, but require more storage space for encryption keys and enciphered data than ML-KEM does.
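
Hash-based signatures have a long pedigree. The classic building block, from which stateless schemes like SLH-DSA are ultimately constructed, is the Lamport one-time signature: signing reveals one secret preimage per bit of the message digest, so each key pair may be used only once. A minimal sketch (not SLH-DSA itself, which layers many such structures into trees):

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def bits(message):
    d = H(message)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    """Reveal the secret matching each bit of the message digest."""
    return [sk[i][b] for i, b in enumerate(bits(message))]

def verify(message, signature, pk):
    """Hash each revealed secret and check it against the public key."""
    return all(H(signature[i]) == pk[i][b] for i, b in enumerate(bits(message)))

sk, pk = keygen()
sig = sign(b"post-quantum", sk)
assert verify(b"post-quantum", sig, pk)
assert not verify(b"tampered", sig, pk)
```

The appeal is that security rests only on the hash function being one-way, an assumption that has survived decades of scrutiny against both classical and quantum attack models.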

There is strength in such diversity. The scare over Dr Chen’s paper highlighted the fact that there has not been enough analysis of lattice-based systems to be confident of their security, notes Bruce Schneier, a cryptography guru at Harvard University. People have tried and failed to break lattice-based algorithms with conventional computers for decades, but there has been much less research into how they might be broken using a quantum computer. Adoption of the new NIST standards should go ahead, he says, but large organisations should aim to be “crypto-agile” as they switch to PQC. That means switching in a way that facilitates further switches in future, as better algorithms become available, or flaws are found in existing ones.

The work underpinning a successful transition has been under way for some time, says Scott Crowder, a quantum specialist at IBM, a computing giant. IBM has made a PQC software update for its Z series mainframe computers, for example, which are still widely used in many industries. Similarly, earlier this year Apple implemented ML-KEM on the iMessage service used on its iPhones, iPads and Macs.

For a typical large company, says Mr Crowder, 80% of the job of switching to PQC will be handled by vendors providing upgrades and patches. The other 20% is more difficult, requiring companies to rejig custom-built internal systems.

One approach, which can ease the transition and also provide extra reassurance, is known as “hybrid” or “composite” cryptography. This involves layering existing, conventional cryptography with PQC. That way, if either system is broken, the other still provides some protection. This can act as an insurance policy for organisations required by regulators to adopt PQC but worried that it may not be totally secure.
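
In practice the hybrid approach often means running two key exchanges, one classical and one post-quantum, and feeding both resulting secrets through a key-derivation step, so that an attacker must break both schemes to recover the session key. A minimal sketch of such a combiner, with the two shared secrets simulated by random bytes (in reality they would come from, say, elliptic-curve Diffie-Hellman and ML-KEM), using HMAC-SHA-256 as a simple key-derivation function:

```python
import hashlib
import hmac
import secrets

# Stand-ins for the outputs of two real key exchanges (an assumption
# for illustration): one classical, one post-quantum.
classical_secret = secrets.token_bytes(32)
pq_secret = secrets.token_bytes(32)

def combine(classical, post_quantum, context=b"hybrid-kex-example"):
    """Derive one session key from both secrets. If either input stays
    unknown to an attacker, the output remains unpredictable."""
    return hmac.new(context, classical + post_quantum, hashlib.sha256).digest()

session_key = combine(classical_secret, pq_secret)
```

Deployed hybrids, such as the X25519-plus-ML-KEM combinations now appearing in TLS, follow this layered pattern, though the exact derivation is specified by the protocol rather than improvised as here.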

The possibility still exists that flaws will be found in NIST’s new standards. But that is no reason to delay. The transition will not be easy, and will not be risk-free. But the time to start is now.


This article appeared in the Science & technology section of the print edition under the headline “Code-switching”


From the August 24th 2024 edition
