Between hype and hard math: your quantum plan
Every few months, I hear about “Q-Day”: the moment when quantum computers break through all of our security. Breathe easy: the threat is real, but it's not what the headlines suggest.
First, let's ask the question: what exactly does quantum “break”?
Let's dive into the theory for a moment. Shor's algorithm can, in theory, crack asymmetric cryptography (RSA, ECC). But to run it, you need a large-scale, fault-tolerant quantum computer: millions of physical qubits under heavy error correction, stable enough to compute for hours on end.
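To make that concrete: Shor's trick is to find the period r of f(x) = a^x mod N; the factors of N then fall out via a greatest common divisor. Here is a toy sketch in Python – entirely classical, so it brute-forces the period, which is precisely the one step a fault-tolerant quantum computer would do exponentially faster:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Brute-force the period r of f(x) = a^x mod n.
    This is the step Shor's algorithm speeds up with a quantum
    Fourier transform; classically it takes exponential time."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int] | None:
    """Recover a nontrivial factorization of n from the period of a."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g                # lucky guess: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None                     # odd period: retry with another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None                     # trivial square root: retry
    p, q = gcd(x - 1, n), gcd(x + 1, n)
    return (p, q) if p * q == n else None

print(shor_factor(15, 7))               # (3, 5) -- the classic textbook example
```

At toy scale this runs instantly; at RSA-2048 scale, the `find_period` loop is the part that only a large, fault-tolerant quantum machine could finish.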
What we have today are NISQ (Noisy Intermediate-Scale Quantum) machines: hundreds to thousands of qubits, short-lived coherence, lots of noise. Clever research tools, but not lock-picks for the internet.
Conclusion: as long as that machinery doesn't exist, there will be no Q-Day for the time being.
Will Q-Day ever come? Okay, the chance is not zero. Progress is real: better qubits, less noise, smarter error correction. But putting a calendar date on “millions of stable qubits” is guesswork. That's why the most common estimate – “another 10–15 years” – has kept sliding for years. It feels like an IT project with a very elastic deadline.
What is the current situation?
Two lines are moving simultaneously. On the one hand, laboratories and vendors are making real progress, but they are still far from breaking RSA-2048 in practice. On the other hand, the line of defense is already shifting: post-quantum cryptography (PQC) is being standardized and rolled out. Symmetric cryptography (such as AES) remains robust with longer keys; it is mainly key exchange and digital signatures that require attention.
The important question here is: what does this mean for you? Should you take action now on a phenomenon whose deadline keeps getting pushed back?
Treat quantum as a long-tail risk with short-term actions. Take stock of where you use asymmetric cryptography (PKI, TLS, VPN, code signing). Determine which data has a long confidentiality period – think “harvest now, decrypt later”. Build crypto agility into your systems so that you can migrate without major renovations. Start pilots with PQC for the most critical flows. In other words: don't panic, but do plan. This will keep you ahead of the curve, with realism instead of noise.
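As a starting point for that inventory, here is a minimal sketch that pulls a server certificate and reports which asymmetric algorithm it depends on. It assumes Python with the third-party cryptography package installed; the hostnames are placeholders for your own endpoints:

```python
# Sketch: inventory which public-key algorithms your endpoints rely on.
# Requires the third-party 'cryptography' package; hostnames are examples.
import ssl
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def key_report(hostname: str, port: int = 443) -> str:
    pem = ssl.get_server_certificate((hostname, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"{hostname}: RSA-{key.key_size} (quantum-vulnerable)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"{hostname}: ECC {key.curve.name} (quantum-vulnerable)"
    return f"{hostname}: {type(key).__name__}"

for host in ["example.com", "example.org"]:  # replace with your own endpoints
    print(key_report(host))
```

Run this over your external endpoints and you have the first column of your migration spreadsheet.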
What is the actual risk you are taking?
The greatest threat posed by quantum computers to security lies in asymmetric cryptography. Think of RSA, elliptic curves (ECC), and the digital certificates that secure the web every day. Once a large-scale, fault-tolerant quantum computer becomes available, it will be able to break these techniques – not in the millennia a classical computer would need, but within hours or days. The result? You can no longer blindly trust the “padlock” in your browser: the guarantee that you are communicating with the right website evaporates.
Symmetric cryptography – such as AES or the BitLocker encryption on your laptop – is a different story. Quantum computers get no “magic highway” to the key here. In theory (!), Grover's algorithm halves the effective key length – it searches the key space in roughly the square root of the classical number of steps – but a simple increase in key size stops it. Symmetric encryption therefore remains usable, provided you choose the right key length.
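In numbers: Grover's algorithm searches N possibilities in roughly √N steps, so the effective key length is halved, and doubling the key length restores the margin:

\[
\text{AES-128: } \sqrt{2^{128}} = 2^{64}\ \text{quantum steps (too thin long-term)},
\qquad
\text{AES-256: } \sqrt{2^{256}} = 2^{128}\ \text{quantum steps (comfortable)}.
\]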
However, there is a catch. In practice, symmetric keys are often negotiated dynamically via protocols such as Diffie-Hellman or RSA key exchange. And it is precisely this negotiation step that is vulnerable: Shor's algorithm solves the underlying factoring and discrete-logarithm problems efficiently, so a large-scale quantum computer could reconstruct the key from the intercepted handshake. That renders the symmetric lock worthless, no matter how strong the key is.
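A toy illustration of why that negotiation step is the weak point, with deliberately tiny numbers (real Diffie-Hellman uses moduli of 2048 bits or more):

```python
# Toy Diffie-Hellman with deliberately tiny numbers.
p, g = 23, 5                      # public modulus and generator

a, b = 6, 15                      # Alice's and Bob's private exponents
A = pow(g, a, p)                  # Alice sends A = g^a mod p
B = pow(g, b, p)                  # Bob sends B = g^b mod p

key_alice = pow(B, a, p)          # both arrive at g^(ab) mod p
key_bob = pow(A, b, p)
assert key_alice == key_bob       # the shared symmetric key

# An eavesdropper sees only p, g, A, B. Recovering 'a' from A is the
# discrete logarithm problem: infeasible classically at real sizes,
# but exactly what Shor's algorithm solves efficiently. At toy scale,
# brute force suffices:
a_recovered = next(x for x in range(p) if pow(g, x, p) == A)
stolen_key = pow(B, a_recovered, p)
assert stolen_key == key_alice    # the "strong" symmetric key is gone
```

Note that AES itself is never attacked here; the attacker simply walks in through the key negotiation.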
The implication is clear: encryption that seems secure today may be unsafe tomorrow. Everything you store or send now can, in theory, be intercepted and later decrypted using a quantum computer. This is known as the “harvest-now, decrypt-later” risk.
Fortunately, the line of defense is already in place. Post-quantum algorithms have been developed that are resistant to known quantum attacks. NIST has meanwhile published the first standards (ML-KEM for key encapsulation, ML-DSA and SLH-DSA for signatures), and other standards organizations are following. You can already test and implement them now, so that sensitive data that needs to remain secret for a long time – medical records, intellectual property, strategic documents – is quantum-proof before Q-Day ever arrives.
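If you want to experiment already, the open-source liboqs project ships Python bindings. Here is a minimal sketch of a full post-quantum key encapsulation with ML-KEM; the method names follow the liboqs-python examples and may differ between versions:

```python
# Sketch of a post-quantum key exchange with ML-KEM (FIPS 203),
# using the liboqs Python bindings ('oqs'); API may vary by version.
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as client, \
     oqs.KeyEncapsulation("ML-KEM-768") as server:
    public_key = client.generate_keypair()           # client publishes a key
    ciphertext, secret_server = server.encap_secret(public_key)
    secret_client = client.decap_secret(ciphertext)  # client recovers it
    assert secret_client == secret_server            # shared symmetric key
```

The shared secret then feeds an ordinary symmetric cipher such as AES-256, exactly as Diffie-Hellman output does today.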
The lesson: don't panic, but migrate in time.
Five Nobel Prizes, zero applications: the story of superconductivity
In the 1980s, there was an almost religious belief in superconductivity. Scientists and engineers—and myself as a 17-year-old—were convinced that it would completely transform the computer industry. The logic was seductively simple: in theory, a superconducting switch can switch much faster than a semiconductor and does so with less energy per operation. Magazines such as Natuur & Techniek painted a future in which chips would not only be more powerful, but also extremely efficient.
The reality proved more stubborn. Despite prototypes and spectacular breakthroughs, practical application failed to materialize. To this day, even the “warmest” practical superconducting materials only work below roughly –180 °C. Cooling to those temperatures costs enormous amounts of energy – far more than the gains superconductivity delivers. In other words, the energy balance is negative: it costs much more than it yields.
Since its discovery in 1911, superconductivity has earned five Nobel Prizes for groundbreaking research. But all that intellectual and financial investment has so far failed to open a route to large-scale commercial applications in IT. Superconductivity remains a laboratory phenomenon for the time being. Perhaps it will one day prove useful in niche applications such as space travel, where extreme cold comes for free – but not in your laptop or data center.
And that is the irony: many of the quantum computers currently being developed do rely on superconductivity. Or they use alternative technologies that also require extreme cold. The result: enormous energy requirements and high costs for every quantum calculation.
The lesson of superconductivity is clear: technological promises often sound better than they turn out to be, and this is no exception. The hype may be great, but the reality is... well, colder – literally and figuratively.
Calculation errors: the Achilles heel of quantum
For enthusiasts, a quick note on why quantum computers aren't delivering yet. The biggest challenge for the current generation of machines is not speed but reliability. Qubits are notoriously fragile: they quickly lose their coherence and react to the slightest external noise. The result? Calculations are riddled with errors and must be corrected constantly.
The most promising solution is called error correction. This involves bundling many physical qubits into a single logical, reliable qubit. In practice, this means that today you need about a hundred physical qubits to make one usable one. And that number can be even higher, depending on the technology.
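For intuition on why that overhead buys reliability, here is a classical stand-in (real quantum codes, such as the surface code, are far more involved): store one logical bit in many noisy physical bits, read it back by majority vote, and the logical error rate drops sharply:

```python
import random

def logical_error_rate(p: float, copies: int, trials: int = 100_000) -> float:
    """Classical analogy for error correction: one logical bit stored as
    'copies' noisy physical bits, read back by majority vote. More
    physical carriers -> fewer logical errors, at a steep overhead."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(copies))
        if flips > copies // 2:       # the majority vote comes out wrong
            failures += 1
    return failures / trials

p = 0.01                              # 1% error per physical bit
for n in (1, 3, 9, 25):
    # rates far below 1/trials will simply print as zero
    print(f"{n:3d} physical bits -> logical error ~ {logical_error_rate(p, n):.2e}")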
Researchers are slowly making progress. Qubits are becoming slightly more stable, and error rates are falling cautiously. But the question is: how far will this trend go? Every step forward requires more effort, more money, and more computing power. The law of diminishing returns is also inexorable here.
As long as error correction does not improve dramatically, the dream of large-scale quantum computers will remain just that for the time being. Calculation errors are the Achilles heel that keeps quantum computing small.
Eight qubits do not make a quantum revolution
Those who talk about “quantum supremacy” often “forget” the scale issue. Meaningful applications—such as breaking cryptography, simulating molecules, or optimizing complex logistics—require millions of reliable qubits. And reliable is the key word here.
Due to high error rates, a single logical qubit today must be constructed from about a hundred physical qubits. Do the math: to get a few million reliable qubits, you need hundreds of millions of physical qubits. That is a scale far beyond our current reach.
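Spelled out with the round numbers used here, taking two million logical qubits as an illustrative target:

\[
\underbrace{2\times10^{6}}_{\text{logical qubits}}
\times
\underbrace{\sim\!100}_{\text{physical per logical}}
= 2\times10^{8}\ \text{physical qubits}.
\]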
By way of comparison, Microsoft recently presented a quantum computer with... eight qubits. Eight. That's like trying to grow a village of eight inhabitants into something the size of the European Union – and at a serious pace, too. You can't do that with a few smart expansions; you have to reinvent your entire architecture and infrastructure. (And add a little more “creativity” ;)
In short: quantum computers are growing, but they are still in their infancy. The path from a handful of qubits to millions of stable computing points is not a matter of years, but of radical technological leaps.
Why wait for quantum when it's already possible?
I find the prediction that we will have a fully-fledged quantum computer in fifteen years' time rather optimistic. The history of superconductivity shows how often promises get stuck in laboratories. Before Q-Day actually arrives, dozens – perhaps even hundreds – of fundamental physics problems will have to be solved. And perhaps that will never happen.
But one thing is certain: you need speed now. Your applications cannot wait for a hypothetical technology that may or may not break through someday.
For anyone who wants speed NOW: at Sciante, we have already cracked the tough nuts that matter to your business. We accelerate applications today by a factor of 100 or more – without additional hardware or expensive cloud resources. That is not a pipe dream; it is proven practice. Our customers are delighted and wish they had found us sooner. And besides time, it saves a lot of money.
Would you also like to know how your IT can become quantum-fast right now? Then make a no-obligation appointment with me.