While the current generation of post-quantum cryptographic (PQC) standards remains secure, this achievement underscores a deeper truth: no cryptographic system stands the test of time unchallenged.
NIST’s Standardisation and the Strength of ML-KEM
The NIST PQC standardisation process, which began in 2016, has been among the most rigorous cryptographic evaluations ever conducted. The finalisation of ML-KEM (based on CRYSTALS-Kyber) as the leading post-quantum key encapsulation mechanism reflects years of global peer review, side-channel resistance testing, and structural scrutiny. ML-KEM was chosen not because it was perfect, but because it was, at the time, the most battle-hardened and scalable candidate.
But history has taught us that no cryptographic algorithm remains unbreakable forever. RSA, once dominant, would fall to Shor's algorithm on a sufficiently large quantum computer. SHA-1, widely deployed until recently, succumbed to practical collision attacks. Even elliptic curve cryptography has had its parameters and curve choices questioned and revised. Cryptography evolves under pressure, and eventually, pressure always arrives.
What the SVP Breakthrough Means for Lattice Security
SVP is a foundational hard problem in lattice-based cryptography, which underpins the security of both ML-KEM and ML-DSA. The XJTLU team’s successful solution of a 200-dimensional SVP instance is not a direct threat to these schemes, which typically rely on dimensions in the 400–600 range, but it is a significant signal.
Professor Ding rightly points out that solving higher-dimensional SVP instances becomes exponentially harder: each increase of roughly 10 dimensions multiplies the computational difficulty by about an order of magnitude. Under that rule of thumb, a 400-dimensional SVP instance is not merely twice as hard as a 200-dimensional one; it is harder by many orders of magnitude.
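Taking the stated rule of thumb at face value (a heuristic, not a rigorous lattice complexity bound), the gap between a 200-dimensional and a 400-dimensional instance can be sketched numerically:

```python
# Back-of-envelope scaling, assuming (as the rule of thumb above suggests)
# that each additional 10 dimensions multiplies SVP difficulty by ~10x.
# This is an illustration of the heuristic, not a real complexity estimate.

def relative_difficulty(dim_from: int, dim_to: int, factor_per_10: float = 10.0) -> float:
    """Relative hardness of SVP at dim_to versus dim_from under the heuristic."""
    return factor_per_10 ** ((dim_to - dim_from) / 10)

print(relative_difficulty(200, 210))  # one step of 10 dimensions -> 10x
print(relative_difficulty(200, 400))  # roughly 1e20 under this heuristic
```

Even if the true per-dimension growth rate differs, the point stands: the jump from record-setting 200-dimensional instances to the 400-plus dimensions used by deployed schemes is a vast, exponential gap, not a linear one.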
However, this achievement was made without quantum computers, and without artificial intelligence assistance. This was purely the result of optimized classical algorithms and academic compute clusters. And while the difficulty of SVP scales exponentially, so does global computing capability. Moore’s Law may be slowing, but distributed computing, specialized hardware, and algorithmic innovation continue to accelerate.
In short: this isn’t a crisis. But it is a warning shot.
Why Betting Everything on Lattice Is a Strategic Risk
Lattice cryptography has earned its place as a PQC frontrunner. But its dominance also presents a risk: monoculture. If the cryptographic world adopts lattice-based primitives exclusively, a future breakthrough in lattice problem-solving could have systemic consequences.
This is not hypothetical. Cryptographic history is littered with once-dominant algorithms that fell quickly when their core assumptions were invalidated. In cybersecurity, redundancy and diversity are essential. We would never secure a system with a single layer of defence, so why secure global communications with a single family of mathematical problems?
Exploring the Alternatives: Code-Based Cryptography and HQC
Fortunately, NIST is already considering diversification. In its ongoing fourth round of PQC standardisation, code-based schemes such as HQC (Hamming Quasi-Cyclic) and BIKE are under active evaluation.
Unlike lattice-based schemes, HQC relies on the hardness of decoding random linear codes, a problem with a distinct security foundation. It is unaffected by the lattice-specific algorithmic advances that drive progress on SVP, and it represents a strong contender for future standardisation.
While it would be premature to switch everything to HQC today, it would be reckless to ignore it.
ExeQuantum’s Response: A Future-Proof PQC Stack
At ExeQuantum, we’ve taken this risk seriously from day one. That’s why we became the first company in the world to integrate HQC support into a production API infrastructure, alongside ML-KEM.
Our system is designed for cryptographic agility. Clients can use ML-KEM by default, but can switch to HQC, test hybrid modes, or prepare fallbacks without rebuilding their stack.
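The pattern described above can be sketched as a pluggable KEM registry: callers bind to a generic interface, and the concrete algorithm is chosen by configuration rather than hard-coded. All names here are illustrative assumptions for the sketch, not ExeQuantum's actual API.

```python
# A minimal sketch of cryptographic agility (illustrative, hypothetical API).
# Real deployments would register actual ML-KEM / HQC implementations.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class KEM:
    name: str
    # keygen() -> (public_key, secret_key)
    keygen: Callable[[], Tuple[bytes, bytes]]
    # encapsulate(public_key) -> (ciphertext, shared_secret)
    encapsulate: Callable[[bytes], Tuple[bytes, bytes]]

REGISTRY: Dict[str, KEM] = {}

def register(kem: KEM) -> None:
    """Make an algorithm available for selection by name."""
    REGISTRY[kem.name] = kem

def get_kem(preferred: str, fallback: str) -> KEM:
    """Select the preferred algorithm, falling back if it is unavailable,
    e.g. after it has been withdrawn following a cryptanalytic break."""
    return REGISTRY.get(preferred) or REGISTRY[fallback]
```

Because callers never reference an algorithm directly, swapping ML-KEM for HQC (or running both in a hybrid mode) becomes a configuration change rather than a rebuild, which is the essence of the agility described above.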
This dual-algorithm approach ensures that organizations aren't just compliant with today's standards, but prepared for tomorrow's threats. In a world where computational leaps can blindside entire industries, being early isn't overkill; it's resilience.
The Takeaway: Plan for Agility Now
ML-KEM is still solid. The SVP record doesn’t break it, but it does break the illusion that any scheme is future-proof.
That’s why cryptographic diversity must be built into our systems now. Not when it’s urgent. Not when regulators catch up. Now.
Organizations that build agility into their cryptographic infrastructure will be the ones that withstand the next paradigm shift. Those who don’t may find themselves stuck with the next broken standard.
At ExeQuantum, we’re not just implementing today’s post-quantum encryption. We’re preparing for the world after it.