How to Keep Secrets in a Post-Quantum World

Sophie Maclean

The list of panelists at the “Post-Quantum Cryptography” panel at the 9th HLF read like a who’s who in computer science.

The first to join the panel was Vinton ‘Vint’ Cerf, often referred to as the “father of the internet” and a rockstar of computer science. He works as an “evangelist” at Google – a job title he chose himself. Cerf won the 2004 ACM A.M. Turing Award for his work on internetworking, including the design and implementation of the internet’s basic communication protocols (TCP/IP), and for his inspired leadership in networking.

Prior to the panel, Whitfield Diffie gave a speedy run-through of modern cryptography. Diffie, who received the 2015 ACM A.M. Turing Award for his eponymous Diffie-Hellman key exchange method (among other things), also sat on the panel. Alongside him was Adi Shamir, who won the 2002 ACM A.M. Turing Award for his work in co-developing the RSA algorithm.

The panel was completed by Vadim Lyubashevsky and Gregor Seiler, both research scientists at IBM Research Europe in Zürich. Their work deals with the theory and practice of quantum-safe computing and they both made significant contributions to the design of some of the schemes recently selected as the upcoming quantum-safe standards for public-key encryption.


From left to right: Tom Geller, Whitfield Diffie, Vinton G. Cerf, Adi Shamir, Vadim Lyubashevsky, Gregor Seiler. © Heidelberg Laureate Forum Foundation / Flemming

What is post-quantum cryptography?

Before we even begin to ponder a “post-quantum world”, we need to agree on what post-quantum actually means. Put most simply, “post-quantum” in the context of cryptography means resistant to attack by quantum computers, i.e. a system or algorithm producing encryption that a quantum computer is unable to break.

The problem with this definition, however, is that it brings up a rather uncomfortable question: How do we know whether an algorithm can be broken? This doesn’t just apply to quantum computing either: we currently have no way of guaranteeing that a classical computing algorithm is secure. The best we can do at the moment is assert that we haven’t been able to break an algorithm yet and hope that remains the case. As Vadim Lyubashevsky put it: “We’ve worked on these problems. We think they’re hard. This is the best we can do.”

This may sound slightly terrifying to you. It did to me. A shiver ran down my spine when Whitfield Diffie pointed out that World War 2 saw the breaking of many algorithms that had previously been thought to be unbreakable. There is a silver lining, however. We do not actually need an algorithm to be impossible to crack. All we need is for an algorithm to be difficult enough to crack that it cannot be done in a useful length of time.

What do I mean by this? Well, each secret has a lifespan. I no longer care if anybody knows who I had a crush on when I was ten. This means that the cipher I used to write a note telling him I had a crush on him (I wish I was joking) only needed to remain safe from attack for fifteen years. Even government secrets have a point in time when they can be declassified, although this typically takes a bit longer. As long as the encryption algorithm takes longer to break than the expected lifetime of the secret, it is for all intents and purposes “secure”. Musing on this led Vint Cerf to propose a new idea of what “secure” should mean: “A relative work factor that postpones the breaking long enough that the information is no longer useful.”

P = NP?

I want to take a bit of an aside here to talk about complexity theory. Complexity theory is the branch of mathematics and computer science that investigates how much time or space an algorithm requires to run, usually as a function of the size of its input (denoted by \(n\)). There is a famous unsolved problem in complexity theory: P versus NP.

P stands for “polynomial time”. An algorithm is said to run in polynomial time if its runtime is a polynomial function of the input size \(n\). A problem is said to be in P if there exists a polynomial-time algorithm that solves it. NP stands for “nondeterministic polynomial time”, and a problem is said to be in NP if a proposed solution can be checked in polynomial time.
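To get a rough feel for the gap (illustrative numbers of my own, not figures from the panel): an algorithm that takes \(n^2\) steps needs \(10,000\) steps on an input of size \(n = 100\), whereas one that takes \(2^n\) steps needs roughly \(1.3 \times 10^{30}\) steps – at a billion steps per second, that works out to on the order of \(10^{13}\) years.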

Every problem in P is also in NP. The P versus NP problem asks whether every NP problem is also in P, i.e. whether or not P and NP are equivalent. The problem was first formally introduced in 1971 by Stephen Cook, who won the 1982 Turing Award “For his advancement of our understanding of the complexity of computation in a significant and profound way.” If that wasn’t enough to show how important this question is, in the year 2000 the Clay Mathematics Institute selected it as one of its “Millennium Prize Problems” – seven problems which each carry a $1,000,000 prize for the first person to solve them. So far only one of the problems has been solved.

An important example of a problem that is in NP – but might not be in P – is prime factorisation: we can easily check a potential prime factorisation by multiplying the prime factors together. We do not yet have a classical polynomial-time algorithm that can factorise numbers, but we can’t say for certain that one doesn’t exist. The reason this example is so significant is simple: the presumed hardness of prime factorisation forms the foundation of the main public-key cryptosystem used in encryption today, RSA. If it were proven that \(P=NP\), essentially all of modern cryptography would, at least in principle, be vulnerable to attack.
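As a rough illustration of that asymmetry – a minimal sketch of my own, not code shown at the panel – checking a claimed factorisation is quick, whereas the obvious way of finding one, trial division, takes a number of steps that grows exponentially with the number of digits of the input:

```python
# A minimal sketch (not from the panel): verifying a factorisation is cheap,
# while finding one by trial division scales terribly with the size of the number.

def verify_factorisation(n, factors):
    """Check a claimed factorisation by multiplying the factors back together.
    This takes time polynomial in the number of digits of n.
    (Checking that each factor is prime is also fast, so it is skipped here.)"""
    product = 1
    for p in factors:
        product *= p
    return product == n

def trial_division(n):
    """Factorise n by trying every candidate divisor up to sqrt(n).
    The number of candidates grows exponentially with the number of digits of n."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

n = 2021  # = 43 * 47, a toy stand-in for an RSA modulus
print(verify_factorisation(n, [43, 47]))  # True, and fast even for enormous n
print(trial_division(n))                  # [43, 47], but hopeless for a 2048-bit n
```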

Thankfully, I have some good news. While we do not have proof that \(P \ne NP \), the overwhelming consensus is that this is the case. William Gasarch, an American computer scientist, has conducted multiple polls of researchers on this topic. In 2002, 61% believed that \(P \ne NP \). By 2012, this had risen to 83% and it rose even further to 88% in 2019. Better still, when restricted to only experts, 99% of those surveyed in 2019 believed \(P \ne NP \).

This means that the biggest threat to modern cryptography is almost certainly quantum computing. You may, at this point, be wondering whether we could simply adapt our current algorithms to keep up. Diffie had the same thought during the panel, and questioned Shamir about whether RSA could steadily outrun attacks simply by increasing the number of bits. Shamir’s answer was clear: “If you have a secret that you want to maintain for the next 100 years, please don’t use RSA.”

Quantum computing is so powerful because it allows a high degree of parallelism: algorithms that are not polynomial time on a classical computer can be adapted, and new polynomial-time algorithms can be created. In fact, there already exists a polynomial-time quantum algorithm that can carry out prime factorisation, known as Shor’s algorithm. All that remains is to build the quantum computers themselves. It should be noted that some quantum computers do already exist, but they have relatively few qubits, so none is yet capable of such a calculation.
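For a flavour of how Shor’s algorithm gets its speed-up – a toy sketch under my own assumptions, with a brute-force loop standing in for the quantum step – the algorithm reduces factoring \(N\) to finding the period \(r\) of \(a^x \bmod N\); once \(r\) is known, \(\gcd(a^{r/2} - 1, N)\) often reveals a factor. The quantum computer is only needed to find the period in polynomial time:

```python
# Toy sketch of the classical reduction behind Shor's algorithm (not the real thing):
# the quantum computer's only job is the period finding, which is done here by
# brute force and therefore takes exponential time on a classical machine.
from math import gcd
import random

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force.
    This is the step a quantum computer can do in polynomial time."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n):
    """Try random bases until the period yields a non-trivial factor of n."""
    while True:
        a = random.randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                      # lucky guess: a already shares a factor with n
        r = find_period(a, n)
        if r % 2 == 0:
            candidate = gcd(pow(a, r // 2) - 1, n)
            if 1 < candidate < n:
                return candidate          # non-trivial factor found

print(shor_classical_part(15))  # prints 3 or 5
```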

How worried should we be?

Once we have more advanced quantum computers, we’re in big trouble – any messages we send using current cryptosystems will be vulnerable to decryption. This might not be something we need to worry about for a while, though.

A unique benefit of the Heidelberg Laureate Forum is the opportunity to get some of the brightest minds in their fields together to discuss problems such as this, and to give their views. Tom Geller asked the panellists when they thought post-quantum cryptography would be needed. It was a relief to hear Vint Cerf opt for “within the next 20 years,” while Shamir, Lyubashevsky and Seiler didn’t think it would be needed even then, opting for “within the next 30 years.”

There are many reasons we do not yet have this technology; the main challenge is widely agreed to be quantum decoherence. A quantum state can be thought of as a probability distribution over the possible measurement outcomes. A pure state, measured in the right basis, gives one particular outcome with probability \(1\) and every other outcome with probability \(0\). Decoherence is when this probability distribution “spreads out” (to use a non-technical term) and the state becomes less and less pure. With decoherence comes loss of information.
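Here is a small numerical sketch of that “spreading out” – my own illustration, not something from the panel: a qubit in an equal superposition starts as a pure state, and dephasing noise gradually shrinks the off-diagonal entries of its density matrix, dragging the purity \(\mathrm{Tr}(\rho^2)\) from \(1\) down towards \(1/2\), the value for a maximally mixed state.

```python
# Illustrative sketch (mine, not the panel's): dephasing noise shrinking the
# off-diagonal coherences of a single-qubit density matrix.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # the pure state |+> = (|0> + |1>)/sqrt(2)
rho = np.outer(plus, plus.conj())           # its density matrix

def dephase(rho, strength):
    """Apply phase damping: keep the diagonal, damp the off-diagonal coherences."""
    damped = rho.copy()
    damped[0, 1] *= (1 - strength)
    damped[1, 0] *= (1 - strength)
    return damped

for step in range(5):
    purity = np.trace(rho @ rho).real       # 1 for a pure state, 0.5 when fully mixed
    print(f"step {step}: purity = {purity:.3f}")
    rho = dephase(rho, strength=0.5)
# step 0: purity = 1.000
# step 1: purity = 0.625
# step 2: purity = 0.531
# step 3: purity = 0.508
# step 4: purity = 0.502
```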

One way to deal with loss of coherence is to use many physical qubits to represent one logical qubit: so if \(1000\) logical qubits are needed, the machine might need, say, \(1,000,000\) physical qubits. That is quite a difficult machine to build, and even if you build many smaller machines instead, you are then faced with the problem of how to link them.
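To see why redundancy buys reliability – a deliberately classical toy of my own; real quantum error correction uses far more sophisticated codes – here is a three-bit repetition code. A single copy of a bit is wrong with probability \(p\), but a majority vote over three noisy copies is wrong only if at least two of them flip:

```python
# Toy illustration (mine, not the panel's): redundancy lowers the effective error
# rate, which is the basic reason one logical qubit needs many physical qubits.
import random

def noisy_copy(bit, p):
    """Return the bit, flipped with probability p."""
    return bit ^ 1 if random.random() < p else bit

def majority_vote(bit, p, copies=3):
    """Encode the bit into several noisy copies and decode by majority vote."""
    votes = sum(noisy_copy(bit, p) for _ in range(copies))
    return 1 if votes > copies // 2 else 0

p, trials = 0.1, 100_000
raw_errors = sum(noisy_copy(0, p) for _ in range(trials))
logical_errors = sum(majority_vote(0, p) for _ in range(trials))
print(f"single copy error rate:   {raw_errors / trials:.3f}")      # ~0.100
print(f"majority vote error rate: {logical_errors / trials:.3f}")  # ~0.028, i.e. 3p^2 - 2p^3
```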

In his introductory talk, Whitfield Diffie had mentioned how quantum computing had been promised for the past 30 years but not yet realised. He described how cynics think it’s like controlled nuclear fusion – always promised but never arriving – whereas optimists always think it’s just around the corner. Adi Shamir added weight to this, pointing out that IBM had previously predicted a machine with more than 1,000 qubits by 2023, yet at the time of the talk, in September 2022, the best available was a 53-qubit device. This is a far cry from the potential 1,000,000-qubit machine we might need.

How do we stay safe?

Suppose, then, that quantum computers will be developed in 30 years’ time. At what point should we switch to quantum-resistant algorithms? Such algorithms are already being developed: there are quantum-safe systems available now, and even some hybrid schemes. In fact, Lyubashevsky and Seiler have themselves been working at IBM to create post-quantum systems and algorithms.

The first thing to note is that although we don’t currently have quantum computers capable of decryption, any messages sent now could be stored and decrypted once we do have the means. This means we can’t necessarily afford to wait long before switching to post-quantum cryptography. Given that the panel predicted we have between 20 and 30 years until such technology is developed, any company or government with long-term secrets that need to be kept beyond that time should be worried.

There is a technical point here, which Vint Cerf was keen to point out. The things likely to be cracked are the methods of key distribution, not the encryption itself. As Diffie explained in his introductory talk, the keys are what is used to encrypt the information, as opposed to the information itself. It is therefore possible that the messages themselves will remain safe, unless the keys are stored alongside them. Cerf admitted he believed that – for high-level secrets – so long as parties didn’t use public-key encryption, they were safe. But that doesn’t mean he’d recommend it. “I also think a brick is safe. It just sits there and doesn’t do anything. It’s very secure,” he quipped.

Another important factor to consider when debating the switch to post-quantum cryptography is how distributed the world currently is. Nowadays there is a general mistrust of institutions among the global population, which has led to a growing movement of people wanting to be responsible for their own data. There are already many distributed systems of this kind, such as the blockchain, most famously used by Bitcoin. In a distributed system, everyone is in charge of their own key, so each user would have to switch to quantum-safe methods themselves. Gregor Seiler argued that “it takes a very long time to deploy cryptography in large systems such as the internet” and hence, in his opinion, the switch should be made now. The panel unanimously agreed that, even if current communication systems aren’t quantum-resistant by default, they should be designed in such a way that switching is easy.


From left to right: Tom Geller, Vinton G. Cerf, Adi Shamir, Vadim Lyubashevsky, Gregor Seiler. © Heidelberg Laureate Forum Foundation / Flemming

What’s next for cryptography?

It might seem that, in terms of cryptography, everything is now sorted: we have quantum algorithms that can break modern cryptography, and we have quantum-safe cryptosystems that are, as far as we can tell, secure. There is still much to discover, however. The back and forth between encryption algorithms and the attacks on them is always going to continue, and there’s plenty of research to be done.

For a start, there is still more work to be done in the realm of classical computing. Quantum computers won’t be around for a while, so it’s important that new algorithms are safe from attack by classical computers, too. Adi Shamir would like to see a move away from the current lattice-based schemes: most of the encryption schemes that remain unbroken are based on lattices, so if a general approach to breaking lattice problems were found, we would be left very vulnerable. On the other hand, Shamir jokingly added that those wanting to work on breaking schemes should focus on lattice-based ones.

How one creates a non-lattice-based scheme is a trickier problem – if any of the panelists knew, you can be sure they would have done it themselves. Some ideas that were floated, however, were looking into systems of algebraic equations, or at Diophantine equations, to see if they could be adapted in some way. Another suggestion was to use non-linear cellular automata.

Another area of research that remains less explored is that of virtual quantum computers. It is possible to simulate a quantum computer on a classical one, but at the moment it requires time or memory that grows exponentially with the number of qubits. Given a practical virtual quantum computer, one could also look into using quantum algorithms to tackle hard problems. I mentioned earlier that quantum computers can carry out prime factorisation in polynomial time, but there are many other difficult problems that haven’t yet been explored from a quantum-computing perspective.
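A quick back-of-the-envelope sketch of why the memory blows up (my own illustration): a straightforward statevector simulator has to store \(2^n\) complex amplitudes for \(n\) qubits, so every extra qubit doubles the memory required.

```python
# Rough sketch (mine): memory needed by a naive statevector simulator, which
# keeps one complex amplitude (16 bytes as complex128) per basis state.
def statevector_bytes(num_qubits):
    return (2 ** num_qubits) * 16  # 2^n amplitudes, 16 bytes each

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")
# 10 qubits: 16,384 bytes
# 30 qubits: 17,179,869,184 bytes (~16 GiB)
# 50 qubits: 18,014,398,509,481,984 bytes (~16 PiB)
```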

Never stop learning

Throughout the panel, the audience learned a vast amount about how cryptography is conducted today, what the future of cryptography may look like, and the challenges we will face when quantum computers capable of decryption become a reality. The audience weren’t the only ones to learn something, though. Over the course of the hour, the laureates quizzed each other on their areas of expertise, asking their peers insightful questions.

The final gem of wisdom was given by Adi Shamir, and was directed at the audience and panel alike. He recommended that anyone interested in learning more should look into Qiskit.org – a free online introduction to quantum computing – and should read the lecture notes from Scott Aaronson’s undergraduate quantum computing course, which he has made freely available on his website.

At this point, Vinton Cerf asked the panel for book recommendations on quantum computation, and the tone became a lot more conversational. It was wonderful to see the laureates talk among themselves, and it was hugely inspiring that the panel ended with someone as esteemed as Cerf asking how he could learn more. That, perhaps, was the most important lesson of the entire week.

The full panel discussion, along with the entire scientific program of the 9th HLF, can be viewed on the HLFF YouTube channel.
