The Promise and Threat of Quantum Computing

Quantum computing's threat to encryption

Quantum computing promises future information security, but simultaneously threatens all information currently protected by 2048-bit RSA encryption. It is time to evaluate the threat and examine possible solutions.

This requires exploring the current state of quantum computing, examining the threat to RSA, and considering developments in quantum-proof encryption capable of mitigating the future risk. We talked to Professor Frank Wilhelm-Mauch, a theoretical physicist working on quantum computing and head of the quantum solid state research group at Saarland University. He is a key figure in the European Union’s €1 billion, 10-year Quantum Flagship project, where he coordinates OpenSuperQ, a task to build a 100-qubit computer that can then be used in further research.

The quantum state

In October 2019, Google announced it had achieved ‘quantum supremacy’ – the point at which a quantum computer completes a task that is practically beyond any classical machine. It was a major achievement: a 54-qubit processor solved a predefined problem in 200 seconds, and Google estimated that it would take the world’s current fastest classical supercomputer 10,000 years to produce the same output.

The world took note. Is quantum computing closer than we thought? Is current encryption more endangered than we had realized?

Quoting an article published in 2011 by Computerworld, the Cyberspace Solarium Commission wrote in March 2020, “Today, classical computers working together and testing 1 trillion keys per second to break that same encryption key would need as much as 10.79 quintillion years, or 785 million times the age of the known universe. However, a quantum computer could perform the same task in about six months.”

The purpose of this statement is to galvanize the government into increasing its quantum research so “that U.S. research remains ahead of that of other countries, particularly China,” and to start using quantum-proof encryption in sensitive communications as soon as possible.

But it is a meaningless statement. It says how long a specified classical computer or computers would take to perform an unspecified task, and then compares it to an unspecified quantum computer of unspecified size and power in the unspecified future.

It is time to evaluate the current status of quantum computing.

The power of quantum

The potential of quantum computing can be seen by comparing it to classical computing. Classical computing no longer relies primarily on processor clock speed, as it did 20 years ago, but on the number of cores in close proximity that can be harnessed to process data in parallel.

The processors used in a classical supercomputer are somewhat better than those in a laptop, but basically similar. The primary difference is the number of cores that can be used in parallel – for example, it was claimed at the 2016 Supercomputer Conference in Frankfurt that China’s Sunway TaihuLight supercomputer had 10,649,600 computing cores.

Put very simply, in classical computing, power comes from parallel processing which comes from linking separate processors. Parallel processing is external to the individual processor.

This principle is reversed in quantum computing, which uses the quantum mechanical property that a single object can be in multiple states at the same time. That multiplicity can be mapped onto the variables of a single algorithm, allowing many values to be processed simultaneously. In other words, in very simple terms, the parallel processing needed for computing power moves inside a single processor.

“In this sense,” Wilhelm-Mauch told SecurityWeek, “the quantum computer is the ultimate parallel computer. Because the number of parallel calculations that are in principle possible grows exponentially with the number of bits. It would take a classical computer an exponential number of parallel processor cores to catch up.”
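
To put numbers on that gap: describing an n-qubit state classically requires 2^n complex amplitudes, so the memory needed to simulate a quantum processor doubles with every added qubit. Here is a minimal back-of-the-envelope sketch (assuming 16 bytes per amplitude, a common double-precision representation):

```python
# Why classical simulation cannot keep up: an n-qubit state is a vector
# of 2**n complex amplitudes, so storage (and the classical parallelism
# needed to match it) doubles with each additional qubit.
for n in (10, 30, 54, 100):
    amplitudes = 2 ** n
    tib = amplitudes * 16 / 2 ** 40   # assuming 16 bytes per complex amplitude
    print(f"{n:>3} qubits -> {amplitudes:.3e} amplitudes (~{tib:.3g} TiB)")
```

At 54 qubits, the size of Google’s processor, the state vector alone would occupy roughly a quarter of an exabyte, which is why the exponential argument bites so quickly.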

But there are still enormous problems for the quantum engineers to solve. Not least of these is that quantum computers only provide probable results. “When a quantum variable has multiple values at the same time and you wish to read it out, you’re getting all of these parallel calculation results with some probability,” said Wilhelm-Mauch. “The trick now is to find a quantum algorithm that on the one hand uses this massive parallelism and on the other hand gives you the right answer with an acceptable probability. Ideally, that probability should be certainty, but 50% is also fine – just run it a couple of times. This is the art of writing quantum algorithms.”
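
The arithmetic behind “just run it a couple of times” is worth spelling out. If each run succeeds with probability p, and a candidate answer can be verified cheaply (as it can for factoring: just multiply the candidate factors together), then the chance that all of k independent runs fail is (1 - p)^k. A minimal illustration:

```python
# Failure probability of k repeated runs of a quantum algorithm that
# succeeds 50% of the time, assuming each answer can be verified cheaply.
p = 0.5
for k in (1, 2, 5, 10, 20):
    print(f"{k:>2} runs: failure probability {(1 - p) ** k:.7f}")
```

After 20 runs the chance of not having seen a correct answer is below one in a million, which is why a 50% success rate per run is perfectly usable.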

It turns out that the main challenge for quantum computer hardware is not so much the number of quantum bits, or qubits, as the probability of an error in each operation. In classical computer hardware, he said, “this probability is mind-bogglingly low. If you are running a classical computer, only in the most extreme security considerations (like managing a nuclear power plant) would you even think about potential hardware failure, because software failure will always occur first. For quantum computers it is very different, first of all because it is very experimental hardware, but also because there are intrinsic aspects of the fragility of quantum mechanics – this opens a lot of routes for error and is why the error rate of quantum operations is very high.”

(It should be noted, without diminishing the magnitude of Google’s quantum supremacy achievement, that the test used was purpose-built for the processor it had.)

Wilhelm-Mauch believes that over time the error rate will probably go down by a factor of around 100, but this will still not be good enough to run even short algorithms with an acceptable error rate. “However,” he added, “there is a field of quantum error correction, which is related to the error correction that we do in classical communication. We use redundancy to spot errors, using simple things like checksums – if the checksum is wrong, there must be a mistake somewhere. Quantum error correction requires that the error rate of your hardware is low enough in the first place. You can then, by redundancy – by putting a lot more sugar into your system – also run short algorithms.”
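
The classical intuition behind that redundancy is easy to demonstrate. The sketch below is only an analogy: it uses a 3-fold repetition code with majority voting, whereas real quantum codes such as the surface code are far subtler because qubits cannot simply be copied. But it shows both halves of the point: redundancy suppresses errors, and it only helps if the raw error rate is already low.

```python
import random

def send(bit: int, flip_prob: float) -> int:
    """Encode one bit as three copies, flip each copy independently
    with probability flip_prob, then decode by majority vote."""
    noisy = [b ^ (random.random() < flip_prob) for b in [bit] * 3]
    return int(sum(noisy) >= 2)

trials = 100_000
for p in (0.05, 0.4):
    errors = sum(send(0, p) for _ in range(trials))
    print(f"raw error rate {p}: encoded error rate {errors / trials:.4f}")

# Expected encoded rate is 3p^2 - 2p^3: about 0.007 when p = 0.05 (a big
# win), but about 0.35 when p = 0.4, where redundancy barely helps.
```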

The RSA problem

The current security of RSA key encryption is based on the almost impossibly hard problem of factoring very large numbers. Although there are algorithms that can do so, the amount of computing power necessary is immense – and easily increased by the defender, since the effort to break the key grows nearly exponentially with the key size.

To put this in context, explained Wilhelm-Mauch, “if you had a classical supercomputer able to break RSA 2048, you could in principle just add 2 digits to the key size and you would make it 4 million times harder to break; that is, it would take 4 million times longer. For this reason,” he added, “any arms race between decryptors with classical supercomputers and encryptors will always be won by the encryptor.”
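
The scaling Wilhelm-Mauch describes can be sketched with the standard heuristic running time of the general number field sieve, the best known classical factoring algorithm. The snippet below uses the textbook L-notation estimate exp((64/9)^(1/3) · (ln n)^(1/3) · (ln ln n)^(2/3)) and ignores lower-order terms, so its outputs should be read as rough ratios rather than exact operation counts:

```python
import math

def gnfs_log2_work(bits: int) -> float:
    """Heuristic general number field sieve cost for a modulus of the
    given bit length, as log2 of the work factor (lower-order terms
    are ignored, so treat results as rough ratios)."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    nats = c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return nats / math.log(2)

for bits in (1024, 2048, 3072, 4096):
    print(f"RSA-{bits}: ~2^{gnfs_log2_work(bits):.0f} operations")
```

The estimate gives roughly 2^87 operations for RSA-1024 against 2^117 for RSA-2048: doubling the key size multiplies the attack cost by about a billion, while the legitimate user’s cost of encrypting grows only polynomially.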

This principle – that encryption will always remain ahead of the ability to build supercomputers with more and more separate cores operating in parallel – does not apply to quantum computers, where the parallelism is effectively increased within the processor.

In theory, powerful quantum computers will be able to defeat current RSA encryption relatively easily. There is already a quantum algorithm available that satisfies the necessary accuracy requirements: Shor’s algorithm, invented in 1994 by the American mathematician Peter Shor. Shor showed that with quantum computers, factorization and discrete logarithms can be solved in polynomial time. A quantum computer can use its inbuilt parallelism to drastically reduce the time taken to crack the key. The algorithm “uses massive quantum parallelism but in the end gets a single useful answer with near certainty,” said Wilhelm-Mauch. “In principle, RSA will be endangered by a 25-year-old algorithm.”
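
The division of labor in Shor’s algorithm is worth seeing concretely: all of the number theory is classical, and the quantum computer is needed for just one step – finding the period (order) of a randomly chosen base, which classically takes exponential time. The toy sketch below does that step by brute force, so it only works for very small numbers, but it shows how a period turns into factors:

```python
import math
import random

def shor_skeleton(n: int) -> tuple[int, int]:
    """Classical skeleton of Shor's algorithm for an odd composite n.
    The brute-force order finding is the only step a quantum computer
    accelerates; everything else is cheap classical post-processing."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g          # lucky: a already shares a factor
        r = 1                         # order finding: min r with a^r = 1 (mod n)
        while pow(a, r, n) != 1:
            r += 1
        if r % 2:
            continue                  # need an even order; pick a new base
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue                  # trivial square root; pick a new base
        return math.gcd(x - 1, n), math.gcd(x + 1, n)

print(shor_skeleton(15))              # e.g. (3, 5)
```

Replace the brute-force loop with quantum order finding and the whole procedure runs in polynomial time, which is exactly Shor’s result.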

But not yet. Let’s go back to Google’s quantum supremacy – the development of a 54-qubit processor was an amazing achievement, but far from what is needed. “If we extrapolate current progress in hardware, number of qubits and error rate,” said Wilhelm-Mauch, “there is still a very long way to go. The qubit error rate needs to come down by a factor of 10 or even 100; and even if we are there, a meaningful attack on RSA that could be done in a day would need a billion qubits in a world where we now have fifty – a billion qubits an order of magnitude better than we have today.”

On our current knowledge of quantum development, RSA encryption is and will remain safe for many years to come. It will eventually be cracked, but the lifetime of most commercial secrets can be measured in just a few years. By the time current RSA is lost, most commercial secrets will naturally no longer be secret.

This doesn’t apply to national security secrets. Sensitive data on political decisions can remain sensitive for decades; material that has been kept secret for many years will suddenly become vulnerable. This explains why governments are becoming increasingly vociferous on the need to move to quantum-proof encryption methods sooner rather than later. “The work being done, such as that by NIST, on classical encryption algorithms that cannot be broken by quantum computing should be taken seriously,” warned Wilhelm-Mauch; “but we don’t need to switch to panic mode. Nevertheless, figuring out the right vendors, and agreeing on and implementing the new algorithms, should converge on an acceptable timescale.”

Agency quantum computers

All of this raises one major question: can we be certain that large government agencies do not already have their own secret quantum computer? Edward Snowden demonstrated that western intelligence agencies such as the NSA in the U.S. and GCHQ in the UK have hoovered up vast troves of internet traffic (GCHQ even had a project called ‘Mastering the Internet’).

In more recent years, China has been suspected of doing something similar, using its state-sponsored APT groups to gain access to telecommunications providers and steal communications. This may partly be behind the U.S. concern over China’s Huawei dominating the world’s 5G networks. The concern is sometimes dismissed on the grounds that even with compromised Huawei equipment, only metadata – and no encrypted content – would be vulnerable.

But as soon as any state intelligence service has access to a quantum computer able to crack RSA, current communications will no longer be safe. And the question is, could any of these agencies have one?

Wilhelm-Mauch believes this is unlikely. “The distance between what is publicly known just in terms of cold, hard performance parameters, and what is needed for the task is so enormous that this is extremely unlikely,” he told SecurityWeek. The process would, he believes, have been visible – something he detailed in a paper (PDF) he co-authored for the BSI (the German Federal Office for Information Security) in 2019:

Assuming that the current technical challenges are met – somewhat better operations, sparse use of voluminous periphery, larger chip areas, interchip connects and upgrades to cryogenic technology – it seems to be possible to have a computer with a million planar transmons and a physical error rate of 1:10000. This would allow to attack 2048 Bit RSA in a few hundred hours. A faster attack (in one hour) would require to connect up to 1000 such units. This would require new technological solutions to connect these units – which have been demonstrated but currently would be too slow. Also, the initial filling of these machines with Helium 3 would require roughly the full annual industrial demand of Helium 3, likely requiring new nuclear facilities to produce this isotope. The financial and human investment in such an effort would be by far larger than current efforts in quantum computing.

The implication is that the ability to crack RSA-2048 in less than a fortnight is probable in the foreseeable future. The ability to do it in one hour is still very distant. Taking a fortnight to discover state secrets or high-value intellectual property is a good return – taking a fortnight to discover commercial conversations is less so. Nevertheless, quantum-proof cryptography should be implemented sooner rather than later.

NIST and quantum-proof encryption

NIST has been working on post-quantum cryptography (PQC) – encryption that can withstand quantum computers – for the last five years. Following an April 2015 workshop that discussed post-quantum cryptography and its potential future standardization, NIST announced a ‘Request for Nominations for Public-Key Post-Quantum Cryptographic Algorithms’ in December 2016. The submission deadline was November 2017, and NIST received 82 submissions from teams with members located in 25 countries (it received only 21 submissions for the earlier AES competition). Of the 82, NIST accepted 69, comprising 20 digital signature schemes and 49 public-key encryption (PKE) or key encapsulation mechanism (KEM) schemes.

In January 2019, NIST announced that 26 algorithms (17 public-key encryption and key-establishment schemes, and nine digital signature schemes) had been selected to proceed to the second round of the standardization process. This was detailed in the status report NISTIR 8240 (PDF). Each submission is described with a brief commentary from NIST.

Candidates were given until March 2019 to make any tweaks to the algorithms, and NIST announced, “With the number of candidates substantially reduced from the first round, we hope that the combined efforts of the cryptographic community will evaluate the remaining candidates and provide NIST with feedback that supports or refutes the security claims of the submitters.” 

On July 22, NIST announced that its program had entered the ‘Selection Round’, which it said would help the agency decide on the small subset of algorithms that will form the core of the first post-quantum cryptography standard.

In finding a suitable post-quantum encryption standard, it is useful to consider that the security of RSA encryption is based on two elements: the difficulty of the mathematical problem (factoring large numbers) that needs to be solved, and the assumed lack of any algorithm or method that can solve the problem with current classical computer technology. The quantum threat comes from the increase in computing power together with the existence of an algorithm (Shor’s algorithm) that can harness that power to solve the problem: both parts are necessary.

PQC consequently requires a new mathematical problem that is not susceptible to any new algorithms that can use quantum power to solve the problem. In reality, that is no different to RSA today – it assumes that there is no algorithm capable of breaking the encryption with classical computing in a meaningful time. (Shor’s algorithm is not general purpose – it offers no route to breaking anything other than factorization or discrete logarithm problems, and requires the parallelism of quantum computers.)

The task is incredibly complex, but the NIST competition makes a solution likely within the next few years. With current known advances in quantum computing, this is likely to be in time to protect future secrets – but existing secrets protected by RSA-2048 or even RSA-3072 may well become vulnerable within the next decade. All thanks to quantum computing and Peter Shor’s 25-year-old algorithm.

Related: Quantum Computing's Threat to Public-key Cryptosystems

Related: Facebook Awards $100,000 for Post-Quantum TLS Security Research

Related: US, German Spies Plundered Global Secrets Via Swiss Encryption Firm: Report

Kevin Townsend is a Senior Contributor at SecurityWeek. He has been writing about high tech issues since before the birth of Microsoft. For the last 15 years he has specialized in information security; and has had many thousands of articles published in dozens of different magazines – from The Times and the Financial Times to current and long-gone computer magazines.
