Peter Wayner
Contributing writer

What’s next for encryption if the RSA algorithm is broken?

Feature
Apr 06, 2021 | 8 mins
Data and Information Security | Encryption | Internet Security

A recent, yet-to-be-proven paper claiming to have found a way to "destroy the RSA cryptosystem" has cryptographers asking what might replace it.

Image: Encrypted blocks of multicolored data cubes rolling out. Credit: Matejmo / Getty Images

What if a big crack appeared overnight in the internet’s security layer? What if the fracture reached deep into the mathematical foundations of the cryptographic algorithms? That appeared to happen in early March when a paper dropped with a tantalizing conclusion in the abstract: “This destroys the RSA cryptosystem.”

If the claim proves correct, a good part of the data that’s encrypted at rest or in motion might not be safe. The first problem was that no one knew if the author was right. The second, even larger problem was that no one was sure what the world should do if the claims were true.

At this writing, mathematicians are still deliberating the first question, but others are addressing the second question and starting to sketch out plans for what to do if a catastrophic weakness appears out of nowhere. They’re pushing for a stronger foundation built out of multiple algorithms implemented with protocols that make switching simpler.

Some cryptographers are looking for RSA replacements because it is just one of several encryption algorithms that may be vulnerable to new machines that exploit quantum effects in electronics. The world must be more agile, they argue, because there are many potential cracks that could appear.

Factoring large numbers

The paper is called “Fast Factoring Integers by SVP Algorithms” and it was written by Claus Schnorr, 77, a respected cryptographer who retired from Johann Wolfgang Goethe University in 2011. He is known, in part, for a digital signature scheme that carries his name. It was first patented in 1988 and is used in a modified form by some digital currencies because it is efficient and produces short signatures. The blockchains for Polkadot, Kusama and Zilliqa are just three examples. It relies upon the presumed intractability of the discrete log problem for its strength.

RSA is a different algorithm with a longer history and, at least until recently, broader adoption. It depends upon the complexity of factoring large numbers. Schnorr’s newly published approach, which has evolved over a series of papers in recent years, recasts the problem of factoring large numbers as a search for the right vector in a multidimensional lattice defined by much smaller numbers, what mathematicians call the shortest vector problem (SVP).
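The link between RSA and factoring is easy to see in a toy example. The sketch below uses deliberately tiny, hypothetical primes (real keys use primes hundreds of digits long) and shows that recovering the factors of the public modulus is all it takes to rebuild the private key:

```python
from math import gcd

# A deliberately tiny, insecure RSA example. The primes here are toy,
# hypothetical values; real keys use primes hundreds of digits long.
p, q = 61, 53
n = p * q                   # public modulus, published to the world
phi = (p - 1) * (q - 1)     # Euler's totient, computable only from the factors
e = 17                      # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)         # private exponent, derived directly from the factors

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # decrypt with the private key d
assert recovered == message

# The point: factoring n back into p and q is all an attacker needs to
# recompute phi and d, so RSA stands or falls with the hardness of factoring.
```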

His paper, though, is largely theoretical, which leaves many wondering whether a real implementation in software would deliver on these claims. Several people have circulated challenges that are large numbers constructed like RSA keys. Anyone who can attack RSA can prove it easily by publishing the factors of such a number. So far, no one has managed such a concrete public demonstration, and some who have tried have announced that they couldn’t make the algorithm work.

Fixing RSA’s weaknesses

Fixing the problems that come from a newly discovered attack is nothing new. Software companies routinely ship patches that fix weaknesses, and they openly solicit bug reports to encourage people to report the problems they find. But the Schnorr paper, if proven, would expose weaknesses in the foundation of a protocol, and there’s no one corporation that takes responsibility for the protocol.

One company, also called RSA, shares quite a bit of history with the algorithm, but the patents have expired, and most implementations of RSA used throughout the internet today did not come from it. Many of the popular libraries are open source and maintained by a community.

To make matters worse, the weakness, if it exists, cannot be fixed with a few lines of new code the way many holes or bugs can. Any solution could take years to appear because it takes time to test and deploy any new algorithm.

Simply switching to a new algorithm may not be that hard because many encryption packages support options for using different algorithms with different key lengths. A deeper challenge may be updating the authentication infrastructure that maintains the certificate hierarchy used to validate the public keys. The current versions of major browsers come with root certificates from the different certificate authorities, and many of those certificates rely upon RSA.
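At the library level, that kind of algorithm agility often comes down to changing a parameter or two. As a rough sketch (using Python’s cryptography package; the algorithms and key sizes below are illustrative choices only, with an elliptic-curve key standing in for whatever replacement is eventually chosen), swapping the key type is a small code change, which is why the certificate hierarchy, rather than application code, tends to be the harder part of a migration:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

# Swapping the key algorithm is often a one- or two-line change in application
# code; the key size and curve below are illustrative, not recommendations.
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ec_key = ec.generate_private_key(ec.SECP256R1())

message = b"data to be signed, e.g., part of a certificate"

# Each key type signs with its own padding and hash parameters...
rsa_signature = rsa_key.sign(message, padding.PKCS1v15(), hashes.SHA256())
ec_signature = ec_key.sign(message, ec.ECDSA(hashes.SHA256()))

# ...but every relying party, and the whole chain of certificates above the
# key, has to agree on the new algorithm, which is where migrations get slow.
```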

Replacing the root certificates in the browsers (and other tools) requires shipping new versions, something that can be tricky because the root certificates are so powerful. Some attacks, for instance, involve inserting fake certificates that will vouch for attackers, allowing them to impersonate other sites. At this writing, the certificates from some major certificate authorities like Verisign, Amazon, GoDaddy, and Entrust all rely upon the RSA algorithm.

Quantum questions

The question of what to do about potential quantum computers is another worry. The cryptography community is already well along the path of looking for replacements that can resist quantum machines, a path it started down several years ago because some fear that the computers may be available soon. They would threaten algorithms like RSA because one of the best-known algorithms for quantum machines, developed by Peter Shor, can factor numbers efficiently. The current machines are not very powerful, and the largest number they’ve factored is 21, but many are planning for the possibility that more powerful models will appear.

Shor’s is not the only algorithm that poses a threat, because factoring large numbers is an attractive challenge, in part because of its economic value. One new paper from Élie Gouzien and Nicolas Sangouard, for instance, uses a special kind of quantum memory. Their design, which is also far from realization, might be able to factor the 2048-bit numbers used in RSA-based root certificates in about half a year.

Imagining a new, non-RSA cryptosystem 

Theoretical results like this are why the National Institute of Standards and Technology (NIST) started a contest in 2016 to identify potential replacements for RSA. The entrants were winnowed to 26 semifinalists in 2019 and then to seven finalists in 2020. The selection process is still open because the committee also chose to bring along eight alternates for further study. NIST hopes to choose a replacement in the next several years, but the pandemic may slow the process.

The finalists are based on three different mathematical approaches. The ones that NIST’s announcement calls the “most promising general-purpose algorithms” use lattices and depend on the difficulty of finding one element or vector in this lattice. There are three different encryption schemes (CRYSTALS-KYBER, NTRU, and SABER) and two different signature algorithms (CRYSTALS-DILITHIUM and FALCON).

NIST also chose an older encryption approach developed by Robert McEliece in 1978. It relies on the difficulty of finding the right solution to a general error-correcting code. These mathematical constructions are normally used in data storage and transmission to recover from errors, but McEliece found a way to change their construction so that recovering from the error is hard for everyone except the people who hold the right secret key. The last finalist, known as Rainbow, is a signature scheme built from systems of polynomials in many variables that are hard for an attacker to solve.

Several alternates were named to encourage more research into their foundations. Some rely on mathematical foundations similar to the finalists’, but some use entirely different approaches. SPHINCS+, for instance, builds signatures out of hashed values.
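SPHINCS+ itself is a much more elaborate construction, but the idea behind hash-based signatures can be sketched with their simplest ancestor, a Lamport one-time signature. The toy code below illustrates that older idea, not the SPHINCS+ design: publish hashes of random secrets, then sign a message by revealing the secrets selected by the bits of its digest.

```python
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Key generation: 256 pairs of random secrets; the public key is their hashes.
# A Lamport key pair can safely sign only one message, hence "one-time".
secret_key = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
public_key = [(H(a), H(b)) for a, b in secret_key]

def bits_of(digest: bytes) -> list:
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message: bytes) -> list:
    # Reveal one secret from each pair, selected by the bits of the message hash.
    return [secret_key[i][b] for i, b in enumerate(bits_of(H(message)))]

def verify(message: bytes, signature: list) -> bool:
    # Hashing each revealed secret must reproduce the published public values.
    return all(H(s) == public_key[i][b]
               for i, (s, b) in enumerate(zip(signature, bits_of(H(message)))))

sig = sign(b"hello")
assert verify(b"hello", sig)        # the signature checks out
assert not verify(b"goodbye", sig)  # and does not transfer to another message
```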

The new potential replacements for RSA may not be easy drop-in substitutes for the existing algorithms. Some are much slower, and others may not offer the same options for both generating signatures and encrypting data. Many depend upon keys that are much larger, and much of the active research is devoted to looking for variants that use smaller keys. It’s not uncommon to hear of keys that may be 500 kilobytes or more, substantially larger than many of the current keys, which may be just several hundred bytes.

Whitfield Diffie, a cryptographer who helped create public key cryptography, notes that the new proposals could require more computation to set up. “We will probably have to do more caching,” he says. “If post-quantum systems are more computationally expensive than the current ones, you may have to spend a minute of computing to negotiate a key with eBay or Amazon, say, on 2 January, keep it all year, and do authentication classically.”

Current protocols typically negotiate a new key for each interaction. Running computationally more expensive key generation less often may result in the same overall cost of computation, says Diffie, but at the cost of “decreasing forward secrecy and diminishing the evidential value of signatures.”

Martin Hellman, another of the original mathematicians who began exploring public key cryptography in the 1970s, suggests that the world may want to develop new protocols that combine several distinct algorithms. Instead of simply relying on one algorithm to create the random key to encrypt some data, the protocol should run several algorithms and combine the keys from all of them into one key (perhaps via XOR or possibly with a more elaborate one-way function).
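A minimal sketch of that combining step, assuming two shared secrets have already been produced by two independent key exchanges (the random bytes below are placeholders for those secrets):

```python
import hashlib
import os

# Placeholders for secrets negotiated by two independent algorithms, say one
# classical key exchange and one post-quantum scheme.
secret_a = os.urandom(32)
secret_b = os.urandom(32)

# The simplest combiner: XOR the equal-length secrets together. The result
# stays unpredictable as long as at least one of the inputs remains secret.
xor_key = bytes(a ^ b for a, b in zip(secret_a, secret_b))

# A more elaborate one-way combiner: hash the concatenation, so breaking the
# final key requires recovering both inputs, not just one of them.
combined_key = hashlib.sha256(secret_a + secret_b).digest()
```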

Hellman has been thinking about how to survive the collapse of an algorithm for more than a decade. Adding this kind of long-term security would prevent the potential panic that might result if a solid attack appears very suddenly.

But even if a catastrophic break never comes, he points out, the smaller, incremental gains from the steady evolution of factoring algorithms (or other attacks) can add up. A good plan for catastrophic collapse could also help defend against that slow increase in attacking power over the years.

“Some data that is secret today should still be secret 50 to 100 years from now,” Hellman says. “So, an advance that far in the future would have dangerous implications for data protected today.”