Cloudflare targets 2029 for full post-quantum security (blog.cloudflare.com)
rdl 1 days ago [-]
It will be interesting to compare PQ rollout to HTTPS rollout historically (either the "SSL becomes widespread in 2015" thing, or the deprecation of SSL 3.0). Cloudflare is in an easy position to do stuff like this because it can decouple end user/browser upgrade cycles from backend upgrade cycles.

Some browsers and some end user devices get upgraded quickly, so making it easy to make it optionally-PQ on any site, and then as that rollout extends, some specialty sites can make it mandatory, and then browser/device UX can do soft warnings to users (or other activity like downranking), and then at some point something like STS Strict can be exposed, and then largely become a default (and maybe just remove the non-PQ algorithms entirely from many sites).

I definitely was on team "the risks of a rushed upgrade might outweigh the risks of actual quantum breaks" until pretty recently -- rushing to upgrade has lots of problems always and is a great way to introduce new bugs, but based on the latest information, the balance seems to have shifted to doing an upgrade quickly.

Updating websites is going to be so much easier than dealing with other systems (bitcoin probably the worst; data at rest storage systems; hardware).

jeroenhd 1 days ago [-]
If any kind of proof about serious quantum computers comes to light, browsers can force most websites' hand by marking non-PQ ciphers as insecure.

Maybe it'll require TLS 1.4/QUIC 2, with no changes but the cipher specifications, but it can happen in two or three years. Certificates themselves don't last longer than a year anyway. Corporations running ancient software that doesn't support PQ TLS will have the same configuration options to ignore the security warnings already present for TLS 1.0/plain HTTP connections.

The biggest problem I can imagine is devices talking to the internet no longer receiving firmware updates. If the web host switches protocols, the old clients will start dying off en masse.

bwesterb 1 days ago [-]
No need for a TLS 1.4.

Leaf certificates don't last long, but root CAs do. An attacker can just mint new certs from a broken root key.

Hopefully many devices can be upgraded to PQ security with a firmware update. Worse than not receiving updates is receiving malicious firmware updates, which you can't really prevent without upgrading to something safe first.

jeroenhd 9 hours ago [-]
> An attacker can just mint new certs from a broken root key.

In Chrome at the very least, the certificate not being in the certificate transparency logs should throw errors and report issues to the mothership, and that should detect abuse almost instantly.

You'd still be DoSing an entire certificate authority, because a factored CA private key means the entire CA is instantly useless, but it wouldn't allow attacks to last long.

bwesterb 8 hours ago [-]
Yeah, PQ certificate transparency is crucial for downgrade protection: https://westerbaan.name/~bas/rwpqc2026/bas.pdf
GoblinSlayer 10 hours ago [-]
When you connect, you specify supported ciphers. If the server doesn't support them, there's a standard "insufficient security" (71) error that has been there since at least TLS 1.0, maybe earlier.
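The alert GoblinSlayer mentions is a single byte on the wire. A minimal sketch of decoding it by hand, per RFC 8446 section 6 (the record bytes below are made up for illustration, not captured traffic):

```python
# A few TLS alert description codes, including the one discussed above.
ALERT_NAMES = {40: "handshake_failure", 70: "protocol_version", 71: "insufficient_security"}

def parse_alert(record: bytes):
    """An alert record is: content type (21), legacy version (2 bytes),
    length (2 bytes), then the payload = level byte + description byte."""
    assert record[0] == 21, "not an alert record"
    level, description = record[5], record[6]
    return ("fatal" if level == 2 else "warning",
            ALERT_NAMES.get(description, "unknown"))

# A fatal insufficient_security alert, as a server could send when there is
# no acceptable overlap in ciphers/groups:
print(parse_alert(bytes([21, 3, 3, 0, 2, 2, 71])))  # ('fatal', 'insufficient_security')
```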
rocqua 7 hours ago [-]
Confidentiality of the TLS connection is indeed easy to handle here.

The hard part is certificate authentication. And that's not included in the cipher suite setting.

PunchyHamster 1 days ago [-]
There is no reason not to support non-quantum-safe algorithms for the foreseeable future in the first place
greesil 24 hours ago [-]
You did not increase comprehension by not using a single negative.
ZiiS 13 hours ago [-]
They are slower, larger, and less tested. Specifically, the hope was to develop hybrids that could also provably be more pre-quantum secure than what they are replacing. History does not favour rushing cryptography.
bwesterb 8 hours ago [-]
They are large, but they're not that slow actually. We've been testing them for almost a decade now. I agree that rushing is bad. That's why we need to start moving now, so that we're not rushing even closer to the deadline.
Hendrikto 11 hours ago [-]
You misread the comment you replied to.
KAMSPioneer 10 hours ago [-]
Which, to be fair, is easy to do because they used a triple-negative.

Rephrased, they meant to say "there is no reason to remove support for quantum-vulnerable algorithms in the near future."

IMO that's much less likely to be accidentally misinterpreted.

bwesterb 1 days ago [-]
Waiting now means rushing even closer to the deadline! We added stats on origin support for post-quantum encryption. Not as much support as browsers of course, but better than I expected. Still a long road (and authentication!). https://radar.cloudflare.com/post-quantum
stingraycharles 1 days ago [-]
> Updating websites is going to be so much easier than dealing with other systems (bitcoin probably the worst; data at rest storage systems; hardware).

IPv6 deserves a prominent spot there

fc417fc802 23 hours ago [-]
Does it? That one is different because IPv4 with CGNAT largely "just works" except for P2P type stuff. As a result there's a strong incentive for anyone who has a working setup to just not care.

I can use myself as an example here. IPv6 is supported by all my hardware, all the software I use, and my ISP provides it. Yet my LAN intentionally remains IPv4 only with NAT. Why? Because adding IPv6 to my LAN would require nonzero effort on my part and has (at least for now) quite literally zero upside for me. If I ever need something it offers I will switch to it but that hasn't happened yet.

PQC is entirely different in that the existence of a CRQC immediately breaks the security guarantee.

cetinsert 1 days ago [-]
You can do PQ queries with us at qi.rt.ht!

Which one do you think is PQ-secure?

https://qi.rt.ht/?pq={api.,}{stripe,paypal}.com

1a527dd5 1 days ago [-]
That is a beautiful api.
lexlambda 1 days ago [-]
> news.ycombinator.com:443 is using X25519, which is not post-quantum secure.

This is the result of Cloudflare's test "Check if a host supports post-quantum TLS key exchange" offered on https://radar.cloudflare.com/post-quantum.

Hoping there is already a migration plan. Fortunately, many modern tools make it easy to switch to PQ; maybe someone knows which stack HN is running and whether a switch would be possible.

Havoc 21 hours ago [-]
Wow that’s a lot better browser support than expected
MrRadar 1 days ago [-]
Along similar lines, Mozilla recently updated their recommended server-side TLS configuration to enable the X25519MLKEM768 post-quantum key exchange now that it's making it into actually-deployed software versions: https://wiki.mozilla.org/Security/Server_Side_TLS At the same time they removed their "old client" compatibility profile as newer TLS libraries do not implement the necessary algorithms (or at least do not enable them by default) and slightly tweaked the "intermediate" compatibility profile to remove a fallback necessary for IE 11 on Windows 7 (now Windows 10 is the minimum compatible version for that profile).
Bender 1 days ago [-]
Is this still theory or are there working Quantum systems that have broken anything yet?
tptacek 1 days ago [-]
Among cryptography engineers there was a sharp vibe shift over the last 2 months; there are papers supporting that vibe shift, but there's also a rumor mill behind it too. The field has basically aligned fully in a way it hadn't before that this is an urgent concern. The simplest way to put it is that everyone's timeline for a real-world CRQC has shortened. Not everyone has the same timeline, but all those timelines are now shorter, and for some important (based on industry and academic position) practitioners, it's down to "imminent".
xienze 1 days ago [-]
> The field has basically aligned fully in a way it hadn't before that this is an urgent concern.

AKA “we want more funding.”

dralley 1 days ago [-]
There's a simultaneous push coming from the government to support PQC, ASAP, so it's not just researchers pushing this.
bwesterb 8 hours ago [-]
You sure? Defenders get funding if things break—not when they actually did their job.
OkayPhysicist 1 days ago [-]
It's theory. The concern is for avoiding a (likely, IMO) scenario where the only real indication that someone cracked QC is one or more teams of researchers in the field going dark because they got pulled into some tight-lipped NSA project. If we wait until we have an unambiguous path to QC, it might well be too late.

To avoid the scenario where for a prolonged period of time the intelligence community has secret access to QC, researchers against that type of thing are incentivized to shout fire when they see the glimmerings of a possibly productive path of research.

rectang 1 days ago [-]
> one or more teams of researchers in the field going dark

If the intelligence community is going to nab the first team that has a quantum computing breakthrough, does it actually help the public to speed up research?

It seems like an arms race the public is destined to lose because the winning team will be subsumed no matter what.

OkayPhysicist 23 hours ago [-]
It's the same logic as any offensive technology: maybe the world would be a better place if we never invented the technology, but we can't risk our enemies having it while we don't, and even if they never develop it maybe it'll help us, and we're the good guys.

Luckily, in this particular arms race, all we the public need to do is swap encryption algorithms, and there's no risk of ending global civilization if we mess up. So we get the best of both worlds: quantum computing for civilian purposes (simulations and whatnot), with none of the terrifying surveillance capabilities. We just need to update a couple of libraries.

fc417fc802 21 hours ago [-]
> It seems like an arms race the public is destined to lose ...

By what margin? An active push can minimize the gap.

However I think you're confusing the existence of a CRQC with adoption of PQC algorithms. The latter can be done in the absence of the former.

evil-olive 1 days ago [-]
still theory, but there seems to be an emerging consensus that quantum systems capable of real-world attacks are closer to fruition than most people generally assumed.

Filippo Valsorda (maintainer of Golang's crypto packages, among other things) published a summary yesterday [0] targeted at relative laypeople, with the same "we need to target 2029" bottom line.

0: https://words.filippo.io/crqc-timeline/

PUSH_AX 1 days ago [-]
Nothing has been broken yet; however, data can be collected now and cracked when the time comes, hence the push.
thenewnewguy 1 days ago [-]
Can a theoretical strong enough quantum computer break PFS?
wahern 1 days ago [-]
QC breaks perfect forward secrecy schemes using non-PQC algorithms, same as for non-PFS. PFS schemes typically use single-use ephemeral DH/ECDH key pairs for symmetric key exchange, separate from the long-term signing keys for authentication.
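To make the harvest-now-decrypt-later point concrete, here is a toy finite-field Diffie-Hellman with deliberately tiny, insecure parameters (all numbers made up for illustration; real PFS uses ephemeral X25519/ECDH, but the structure of the attack is the same). The eavesdropper records only the public values; once discrete logs become computable (the brute-force loop below standing in for a CRQC running Shor's algorithm), the "forward-secret" session key falls out:

```python
p, g = 2039, 2                      # toy prime and generator

a, b = 855, 1274                    # ephemeral secrets, never transmitted
A, B = pow(g, a, p), pow(g, b, p)   # public keyshares: these ARE on the wire

session_key = pow(B, a, p)          # both sides derive this

# Later, the attacker solves the discrete log of the recorded value A
# (brute force here; a CRQC would do this for real curve sizes):
a_recovered = next(x for x in range(1, p) if pow(g, x, p) == A)
assert pow(B, a_recovered, p) == session_key   # recorded session decrypted
```

Forward secrecy protects against compromise of long-term keys, but the ephemeral exchange itself is still classical math that a recording plus a future CRQC can unwind.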
ifwinterco 13 hours ago [-]
If you store a whole session of traffic from today you can break the key exchange with a quantum computer in the future.

AES probably can't be broken but that's irrelevant because in this scenario you have the key in plaintext from the key exchange

ankit_mishra 1 days ago [-]
[dead]
matt123456789 17 hours ago [-]
Still theory
bwesterb 8 hours ago [-]
When it's real, it's too late.
moi2388 1 days ago [-]
Theory. And afaik there are still questions as to whether the PQ algorithms are actually secure.
mswphd 24 hours ago [-]
there are no meaningful questions. The only way there are meaningful questions is if you think global cryptographers + governments are part of a cabal to build insecure schemes. The new schemes:

1. use cryptography developed across the world;
2. the actual schemes were overwhelmingly by European authors;
3. they were standardized by the US;
4. other countries' standardizations have been substantially similar (e.g. the ongoing Korean one, the German BSI's recommendations; China's CACR [had one with substantially similar schemes](https://www.sdxcentral.com/analysis/china-russia-to-adopt-sl...). Note that this is separate from a "standardization", which sounds like it is starting soon).

In particular, given that China + the US ended up with (essentially the same) underlying math, you'd have to have a very weird hypothetical scenario for the conclusion to not be "these seem secure", and instead "there is a global cabal pushing insecure schemes".

tptacek 1 days ago [-]
There are not in fact meaningful questions about whether the settled-on PQC constructions are secure, in the sense of "within the bounds of our current understanding of QC".
ls612 1 days ago [-]
Didn't one of the PQC candidates get found to have a fatal classical vulnerability? Are we confident we won't find any future oopsies like that with the current PQC candidates?
tptacek 1 days ago [-]
The whole point of the competition is to see if anybody can cryptanalyze the contestants. I think part of what's happening here is that people have put all PQC constructions in one bucket, as if they shared an underlying technology or theory, so that a break in one calls all of them into question. That is in fact not at all the case. PQC is not a "kind" of cryptography. It's a functional attribute of many different kinds of cryptography.

The algorithm everyone tends to be thinking of when they bring this up has literally nothing to do with any cryptography used anywhere ever; it was wildly novel, and it was interesting only because it (1) had really nice ergonomics and (2) failed spectacularly.

wahern 21 hours ago [-]
SIKE made it all the way to round 3. It failed spectacularly, but it happened rather abruptly. In one sense it wasn't surprising because of its novelty, but the actual attack was somewhat surprising--nobody was predicting it would crumble so thoroughly so quickly. Notably, the approach undergirding it is still thought secure; it was the particular details that caused it to fail.

It's hubris to say there are no questions, especially for key exchange. The general classes of mathematical problems for PQC seem robust, but that's generally not how crypto systems fail. They fail in the details, both algorithmically and in implementation gotchas.

From a security engineering perspective, there's no persuasive reason to avoid general adoption of, e.g., the NIST selections and related approaches. But when people suggest not to use hybrid schemes because the PQC selections are clearly robust on their own, well then reasonable people can disagree. Because, again, the devil is in the details.

The need to proclaim "no questions" feels more like a reaction to lay skepticism and potential FUD, for fear it will slow the adoption of PQC. But that's a social issue, and imbibing that urge may cause security engineers to let their guard down.

tptacek 19 hours ago [-]
What's your point? SIKE has literally nothing to do with MLKEM. There is no relationship between the algorithms. Essentially everyone working on PQC, including Bernstein himself, has converged on lattices, which, again, were a competitor to curves as a successor to RSA --- they are old.

SIKE: not lattices. Literally moon math. Do you understand how SIKE/SIDH works? It's fucking wild.

I'm going to keep saying this: you know the discussion is fully off the rails when people bring SIKE/SIDH into it as evidence against MLKEM.

wahern 18 hours ago [-]
You may not have any questions about the security of ML-KEM, but many people do. See, for example, DJB's compilation of such doubts from the IETF WG: https://blog.cr.yp.to/20260221-structure.html

DJB himself seems to prefer hybrid over non-hybrid precisely over concern about the unknowns: https://blog.cr.yp.to/20260219-obaa.html

These doubts may not be the kind curious onlookers have in mind, but to say there are no doubts among researchers and practitioners is a misrepresentation. In fact, you're flatly contradicting what DJB has said on the matter:

> SIKE is not an isolated example: https://cr.yp.to/papers.html#qrcsp shows that 48% of the 69 round-1 submissions to the NIST competition have been broken by now.

https://archive.cr.yp.to/2026-02-21/18:04:14/o2UJA4Um1j0ursy...

Unqualified assurances are what you hear from a salesman. You're trying to sell people on PQC. There's no reason to believe ML-KEM is a lemon, but you're effectively saying, "it's the last KEX scheme we'll ever need", and that's just not honest from an engineering point of view, even if it's what people need to hear.

tptacek 16 hours ago [-]
I think you just gave away the game. To the extent I believe a CRQC is imminent, I suppose I am "trying to sell people on PQC". But then, so is Daniel Bernstein, your only cryptographically authoritative cite to your concern. Bernstein's problem isn't that we're rushing to PQC. It's that we didn't pick his personal lattice proposal.

And, if we're on the subject of how trustworthy Bernstein's concerns are, I'll note again: in his own writing about the potential frailty of MLKEM, he cites SIKE, because, again, he thinks you're too dumb to understand the difference between a module lattice and a generic lattice.

Finally, I'm going to keep saying this until I don't have to say it anymore: PQC is not a "kind" of cryptography. It doesn't mean anything that N% of the Round 1 submissions to the NIST PQC Contest were cryptanalyzed. Multivariate quadratic equation cryptography, supersingular isogeny cryptography, and F_2^128 code-based cryptography are not related to each other. The point of the contest was for that to happen.

ls612 23 hours ago [-]
Yeah, I get that. What I'm really asking: I know that in my field I can quickly get a vibe as to whether certain new work is good or not so good, and where any bugaboos are likely to be. For those who know PQC like I know economics, do they believe at this point that the algorithms have been analyzed to a level comparable to DH or RSA? Or is this really gonna be a rush job under the gun because we have no choice?
tptacek 23 hours ago [-]
Lattice cryptography was a contender alongside curves as a successor to RSA. It's not new. The specific lattice constructions we looked at during NIST PQC were new iterations on it, but so was Curve25519 when it was introduced. It's extremely not a rush job.

The elephant in the room in these conversations is Daniel Bernstein and the shade he has been casting on MLKEM for the last few years. The things I think you should remember about that particular elephant are (1) that he's cited SIDH as a reason to be suspicious of MLKEM, which indicates that he thinks you're an idiot, and (2) that he himself participated in the NIST PQC KEM contest with a lattice construction.

lmm 20 hours ago [-]
Bernstein's ego is at a level where he thinks most other people are idiots (not without some justification), that's been clear for decades. What are you hinting at?
tptacek 16 hours ago [-]
I'm not saying anything about his ego or trying to psychoanalyze him. I'm saying: he attempted to get a lattice scheme standardized under the NIST PQC contest, and now fiercely opposes the standard that was chosen instead.
cwillu 1 days ago [-]
It's the same situation with classical encryption. It's not uncommon for a candidate algorithm to be discovered to be broken during the selection process.
sophacles 1 days ago [-]
tbf - since we still don't know whether P != NP, there are still questions about whether the current algorithms are secure too.
draw_down 9 hours ago [-]
[dead]
moi2388 1 days ago [-]
Fair, but recently several PQ algorithms have been shown to in fact not be secure, with known attacks, so I wouldn’t equate them
tptacek 1 days ago [-]
Which PQ algorithms would you be referring to here?
nick238 1 days ago [-]
tptacek 1 days ago [-]
Why don't you go ahead and pick out the attacks in here that you think are relevant to this conversation? It can't be on me to do that, because obviously my subtext is that none of them are.
sophacles 1 days ago [-]
Interesting. I'd like to learn more about this - where can I find info about it?
mswphd 24 hours ago [-]
they're almost assuredly talking about two things (maybe 3 if they really know what they're talking about, but the third is something that people making this argument like to pretend doesn't exist).

1. the main "eye catching" attack was the [attack on SIDH](https://eprint.iacr.org/2022/975.pdf). it was very much a "thought to be entirely secure" to "broken in 5 minutes with a Sage (python variant) implementation" within ~1 week. Degradation from "thought to be (sub-)exp time" to "poly time". very bad.

2. the other main other "big break" was the [RAINBOW attack](https://eprint.iacr.org/2022/214.pdf). this was a big attack, but it did not break all parameter sets, e.g. it didn't suddenly reduce a problem from exp-time to poly-time. instead, it was a (large) speedup for existing attacks.

anyway, someone popular among some people in tech (the cryptographer Dan Bernstein) has been trying (successfully) to slow the PQC transition for ~10 years. His strategy throughout has been complaining that a very particular class of scheme ("structured LWE-based schemes") are suspect. He has had several complaints that have shifted throughout the years (galois automorphism structure for a while, then whatever his "spherical models" stuff was lmao). There have been no appreciable better attacks (nothing like the above) on them since then. But he still complains, saying that instead people should use

1. NTRU, a separate structured lattice scheme (that he coincidentally submitted a scheme for standardization with). Incidentally, it had [a very bad attack](https://eprint.iacr.org/2016/127) ~ 2016. Didn't kill PQC, but killed a broad class of other schemes (NTRU-based fully homomorphic encryption, at least using tensor-based multiplication)

2. McEliece, a scheme from the late 70s (that has horrendously large public keys --- people avoid it for a reason). He also submitted a version of this for standardization. It also had a [greatly improved attack recently](https://eprint.iacr.org/2024/1193).

Of course, none of those are relevant to improved attacks on the math behind ML-KEM (algebraically structured variants on ring LWE). There has been some progress on these, but not really. It's really just "shaving bits", e.g. going from 2^140 to 2^135 type things. The RAINBOW attack (of the first two, the "mild" one) reduced things by a factor of ~2^50, which is clearly unacceptable.

Unfortunately, adherents of Dan Bernstein will pop up and start confidently saying a bunch of stuff that is much too annoying to refute, as they have no clue what the actual conversation is. So the conversation becomes

1. people who know things, who tend to not bother saying anything (with rare exceptions), and 2. people who parrot Dan's (very wrong at this point honestly, but they've shifted over time, so it's more of 'wrong' and 'unwilling to admit it was wrong') opinions.

the dynamic is similar to how when discussions of vaccines on the internet occur, many medical professionals may not bother engaging, so you'll get a bunch of insane anti-vax conspiracies spread.

tptacek 22 hours ago [-]
For whatever it's worth I think I cosign all of this.
sophacles 18 hours ago [-]
In the context of: a green username offering some salacious/conspiratorial things about djb around a topic I'm only a little familiar with... It's worth a lot. It's the difference between me writing it off as (at best) a poorly informed misunderstanding of a complex topic, and me choosing to spend some time learning more. Ty
tptacek 18 hours ago [-]
None of this is really salacious or conspiratorial. I don't know how big a deal the attacks they're citing are. But this is directionally mostly stuff I've heard from lots of cryptography engineers over the last couple years. I know the comment is off comparing attacks on classical NTRU to SNTRUP though!
sophacles 18 hours ago [-]
As someone way out of the loop on pqc, this bit:

> anyway, someone popular among some people in tech (the cryptographer Dan Bernstein) has been trying (successfully) to slow the PQC transition for ~10 years

Sounds enough like throwing shade to make me doubt its value, in absence of other signals.

My point was your history of posting knowledgeably about security and cryptography provides the credibility for me to go do more reading about the stuff in mswphd's post.

tptacek 18 hours ago [-]
Oh, Bernstein is a vocal and relentless opponent of MLKEM. Both the industry and research cryptography have settled on MLKEM. That's the subtext. You could word it differently and more charitably, but I wouldn't.
sophacles 18 hours ago [-]
Ty for the info. This is interesting and provides a lot of things I can go down rabbit holes looking into.
bdeol22 11 hours ago [-]
2029 is plausible at Cloudflare's edge; the long tail is boring enterprise TLS configs someone last touched in 2017.
bwesterb 8 hours ago [-]
Yeah, it's rough. Important to understand now for each product / system what the business impact is if it's not upgraded in time.
bdeol22 7 hours ago [-]
Agreed. The migrations that stall are usually missing an explicit owner for each TLS surface, not missing algorithms. Business impact is the forcing function once you know who gets paged.
weightedreply 24 hours ago [-]
Any information on future CPUs with support for hardware-accelerated PQC algorithms? Will all my old devices become slow when PQC is the norm and encrypted communication is no longer hardware accelerated?
MrRadar 24 hours ago [-]
Only the asymmetric portion of the cryptography (which is only used in the handshake) will need to use PQC algorithms. Symmetric crypto algorithms (AES/ChaCha20/SHA-*), which are used after the handshake, are not as badly affected by quantum computing so they're not being replaced in the immediate term. I'm pretty sure that general purpose CPUs do not have hardware acceleration for the asymmetric crypto anyways.
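A back-of-envelope sketch of why only the asymmetric side needs replacing (the figures are textbook estimates, not from the thread): Shor's algorithm breaks RSA/ECC outright, while Grover's only roughly halves effective symmetric strength, which is why AES-256 survives.

```python
# Approximate security in bits against the best known quantum attacks.
classical_bits = {"RSA-2048": 112, "X25519": 128, "AES-128": 128, "AES-256": 256}

def quantum_bits(name: str, bits: int) -> int:
    if name.startswith(("RSA", "X25519")):
        return 0          # Shor: polynomial-time factoring / discrete log
    return bits // 2      # Grover: quadratic speedup on brute-force search

print({k: quantum_bits(k, v) for k, v in classical_bits.items()})
# {'RSA-2048': 0, 'X25519': 0, 'AES-128': 64, 'AES-256': 128}
```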
mswphd 24 hours ago [-]
you don't really need that tbh. you can get pretty good speedups using standard (vector) intrinsics. the new algorithms are (mostly) modular linear algebra (+ some concept of "noise").
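A toy sketch of the "modular linear algebra + noise" shape mswphd describes (the dimension here is a toy; only the modulus 3329 is ML-KEM's real one): computing b = A·s + e mod q, where recovering the small secret s from the pair (A, b) is the LWE problem.

```python
import random

q, n = 3329, 8                       # ML-KEM's modulus; toy dimension
random.seed(0)                       # deterministic for illustration

A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
s = [random.randrange(-2, 3) for _ in range(n)]    # small secret
e = [random.randrange(-2, 3) for _ in range(n)]    # small noise

# The core operation vectorizes well: a matrix-vector product mod q.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]
print(b)   # the pair (A, b) can be published; s stays hidden in the noise
```

This is why standard vector intrinsics go a long way: the hot loop is just multiply-accumulate with modular reduction.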
TacticalCoder 23 hours ago [-]
Tangential question...

Seeing that many are already moving to QC-resistant cryptography, and that more are shifting by the day, I've got a question: what are the implications of quantum computers going to be once the entirety of cryptography has moved to quantum-resistant schemes?

In other words: I only ever read about quantum computing when it's to talk about breaking cryptography. But what if all cryptography moves to quantum-resistant schemes, all of it... Then what are the uses of quantum computing? Protein folding? Logistics?

Basically, so far, the main effect of quantum computing research has been that many companies and projects are adding quantum-resistant cryptographic schemes.

If, say, we've got a $10 million quantum computer that can break one 256 bit elliptic curve key in an hour... Great, EC is broken. But what if browsers, SSH, auth, etc. just about everything moves to PQ schemes...

Then what are those quantum computers useful for?

I understand that breaking even a single EC 256 bit key in a few hours on a $$$ machine is a very big deal.

But what else are they going to be useful for? Breaking ECC doesn't help humanity. It doesn't bring anything. It only destroys.

EDIT: for example I read stuff like: "Estimates are about three years to break a single 256 bit EC key on a 10,000-qubit quantum computer". What's a 10,000-qubit quantum computer going to be used for when everybody shall have moved to quantum-resistant algos?

snowwrestler 8 hours ago [-]
To start, I am NOT an expert on the underlying technologies. But I have some exposure to the topic at let’s say more like an ecosystem level.

There are tons of hypothesized applications for quantum computing based on the expectation it will provide better simulation of quantum effects for e.g. chemistry, and offer major speedups of highly parallel simulation problems like nuclear plasma or some things in finance. Easy to Google to learn more about these.

But keeping the focus squarely on the military and intelligence services, one answer to your question is that everyone is not going to switch to post-quantum cryptography instantaneously. It’s going to take a while, especially for a long tail of “infrastructure” type things like networking gear, “internet of things,” industrial sensors, etc. Things that national intelligence services might like to break into to enable breaking into other things.

Quantum breaks may also still succeed against stored encrypted data from before the switch to PQ. And for at least a couple decades, national intelligence services have been scaling up their storage resources. So they might have a “backlog” they can work through.

Finally, things don’t have to last forever. Everything the military / government builds has an expected lifespan, and it only has to be valuable during that life span. And risks can be rare but huge in national security. So if quantum code-breaking computers only help the NSA learn a few very important things for a limited time, that still might be “worth it” to them. Or if a quantum computer doesn’t break any important cryptography, but helps advance the engineering and enables better quantum computers in the future for other anpplications—again, still might be worth it.

phicoh 8 hours ago [-]
We can assume that organizations like the NSA have collected a huge amount of traffic that is protected by RSA or EC. So they will have plenty of use for those quantum computers.
wofo 24 hours ago [-]
Does this mean we should be migrating our SSH keys to post-quantum crypto right now?
crote 22 hours ago [-]
OpenSSH has supported post-quantum key agreement since 2022, and since 10.1 (October 2025) you'll get a warning if your connection isn't using it. It doesn't require rotating your keys, just upgrading the software on both sides.

Post-quantum signatures will require rotating your keys, but that's less urgent.
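For context, pinning the hybrid key agreement crote describes can be sketched in ssh_config. The algorithm names below are as shipped in recent OpenSSH (mlkem768x25519-sha256 arrived in 9.9, sntrup761x25519-sha256@openssh.com back in 8.5); treat them as an assumption to verify against `ssh -Q kex` on your installed version:

```
Host *
    # Prefer the ML-KEM hybrid, fall back to the older NTRU Prime hybrid
    KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha256@openssh.com
```

On an up-to-date client this is usually redundant, since recent OpenSSH already prefers these exchanges by default; it mainly guards against a downgrade to a classical-only exchange.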

tombert 1 days ago [-]
Outside of the PQ algorithms not being as thoroughly vetted as others, are there any negatives to shifting algorithms? Like even if someone were to prove that quantum computing is a dud, is there any reason why we shouldn't be using this stuff anyway?
mswphd 24 hours ago [-]
they are much more thoroughly vetted than other schemes were. They're more thoroughly vetted than elliptic curves were before we deployed them. Much more vetted than RSA ever was.

Practically though, there are some downsides. Elliptic curves tend to have smaller ciphertexts/keys/signatures/so are better on bandwidth. If you do everything right with elliptic curves, we're also more confident in the hardness of the underlying problems (cf "generic group lower bounds", and other extensions of this model).

The new algorithms tend to be easier to implement (important, as a big source of practical insecurity is implementation issues. historically much more than the underlying assumption breaking). This isn't uniformly, e.g. I still think that the FN-DSA algorithm will have issues of this type, but ML-DSA and ML-KEM are fine. They're also easier to "specify", meaning it is much harder to accidentally choose a "weak" instance of them (in several senses. the "weak curve" attacks are not really possible. there isn't really a way to hide a NOBUS backdoor like there was for DUAL_EC_DRBG). They also tend to be faster.

MrRadar 1 days ago [-]
Post-quantum algorithms tend to be slower than existing elliptic curve algorithms and require more data to be exchanged to provide equivalent security against attacks run on non-quantum computers.
tombert 1 days ago [-]
Any idea how much slower? Like are we talking half the speed? A quarter? 1%?

Sorry, I'm just very out of the loop on some of this stuff and I'm trying to play a game of catchup.

MrRadar 1 days ago [-]
This page lists some figures for ML-KEM-768 (the PQ key exchange algorithm that's most widely deployed today): https://blog.cloudflare.com/pq-2025/#ml-kem-versus-x25519 It's actually about twice as fast as X25519 (a highly optimized ECC algorithm), but requires 1,184 bytes of data to be exchanged per keyshare vs 32 for X25519. In practice everyone today uses a hybrid (you do both ECC and PQ, in case the PQ algorithm has an undiscovered weakness), so an ECC+PQ key exchange will be strictly slower than an ECC-only one.
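A back-of-envelope sketch of that bandwidth difference, using the published X25519 and ML-KEM-768 sizes:

```python
# Wire cost of the X25519MLKEM768 hybrid key share in TLS 1.3,
# vs. a pure-X25519 exchange.

X25519_PUBKEY = 32           # bytes, sent in each direction
MLKEM768_ENCAP_KEY = 1184    # client -> server (ClientHello key share)
MLKEM768_CIPHERTEXT = 1088   # server -> client (ServerHello key share)

client_share = MLKEM768_ENCAP_KEY + X25519_PUBKEY   # 1216 bytes
server_share = MLKEM768_CIPHERTEXT + X25519_PUBKEY  # 1120 bytes
pure_x25519 = 2 * X25519_PUBKEY                     # 64 bytes total

print(client_share + server_share)  # 2336 bytes, ~36x the pure-ECC cost
```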

This page lists some numbers for different PQ signature algorithms: https://blog.cloudflare.com/another-look-at-pq-signatures/#t... Right now NIST has selected three different ones (ML-DSA, SLH-DSA, and Falcon a.k.a. FN-DSA), each with different trade-offs.

SLH-DSA is slow and requires a large amount of data for signatures; however, it's considered the most secure of the three (it's based on the well-understood security properties of symmetric hash algorithms), so it was selected primarily as a "backup" in case the other two are both broken (which is conceivable, as they're both based on the same mathematical structure).

ML-DSA and Falcon are both fairly fast (within an order of magnitude of Ed25519, the signature scheme over the same curve as X25519), but both require significantly larger keys (41x/28x) and signatures (38x/10x) compared to Ed25519. Falcon has the additional constraint that achieving the listed performance requires a hardware FPU with constant-time IEEE-754 double-precision math. CPUs without such an FPU must fall back to software emulation of the floating-point math (most phone, desktop, and server CPUs have one, but many embedded CPUs and microcontrollers do not).

The net result is that TLS handshakes with PQ signatures and key exchange may balloon to high single- or double-digit kilobytes in size, which will be especially impactful for users on marginal connections (and may break some "middle boxes" https://blog.cloudflare.com/nist-post-quantum-surprise/#dili...).
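Those key/signature ratios fall straight out of the published parameter sizes; a quick sketch:

```python
# Size ratios vs. Ed25519, from published parameters:
# Ed25519 (32 B keys, 64 B sigs), ML-DSA-44 (FIPS 204), Falcon-512.
# Falcon signatures vary slightly in length; 666 B is the padded format.

ED25519 = {"pk": 32, "sig": 64}
SCHEMES = {
    "ML-DSA-44": {"pk": 1312, "sig": 2420},
    "Falcon-512": {"pk": 897, "sig": 666},
}

for name, s in SCHEMES.items():
    print(f"{name}: keys {s['pk'] / ED25519['pk']:.0f}x, "
          f"sigs {s['sig'] / ED25519['sig']:.0f}x")
# ML-DSA-44: keys 41x, sigs 38x
# Falcon-512: keys 28x, sigs 10x
```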

srdjanr 23 hours ago [-]
AFAIK, PQ certificates are significantly longer than current ones. I don't know exact numbers though.
ossianericson 1 days ago [-]
The CDN part is the easy half. In my work the harder problem has most often been the internal service mesh, mTLS between services, and any infra that doesn't terminate at a CDN. That infra has a bad habit of longer certificate lifetimes and older TLS stacks, and nobody is upgrading it for you.
teaearlgraycold 20 hours ago [-]
Mullvad has PQ encryption available today. I recommend everyone use them, they're a 10/10 company.
nalekberov 22 hours ago [-]
Yet the same Cloudflare wants to control the entire internet's traffic single-handedly.

The Internet was not created for this.

One could argue 'but they are very good at preventing DDoS attacks'. Yes, they are; however, they have always loved control and kept their technology proprietary to lock customers into their systems. And one day, a single line of code disrupted many services on the web.

Centralization and monopolies are much bigger threats to the future of the internet, IMHO. (Which always follows the same pattern: give your customers free or unbelievably cheaper services, even at a loss, lock them in, then jack up the price.)

20k 1 days ago [-]
Quantum computing, and the generic term 'quantum', is gearing up to be the next speculative investment hype bubble after AI, so prepare for a lot of these kinds of articles.
Hasz 1 days ago [-]
Nah. Governments around the world are hoovering up traffic today in the hope of a "cheap" (by nation-state standards) quantum computer. Some of the secrets sent today are "evergreen" (i.e. still relevant 10+ years into the future), amongst a whole lot of cruft. There is a massive incentive to hide the technology, to keep your peers transmitting with vulnerable encryption as long as possible.
nickspacek 1 days ago [-]
For sure, that or just ensuring they have laws in place that grant them access to the unencrypted data we are sending to CDNs operating in their jurisdiction (when necessary for national security reasons).
bwesterb 1 days ago [-]
At least it's time bound: hope to have this job done by 2029!
_2fnr 1 days ago [-]
[flagged]
jgrahamc 1 days ago [-]
Cloudflare has long been doing work on PQ (sometimes in conjunction with Google) and rolled out PQ encryption for our customers. You can read about where this all started for us 7 years back: https://blog.cloudflare.com/towards-post-quantum-cryptograph... and four years ago rolled out PQ encryption for all customers: https://blog.cloudflare.com/post-quantum-for-all/

The big change here is that we're going to roll out PQ authentication as well.

One important decision was to make this "included at no extra cost" with every plan. The last thing the Internet needs is blood-sucking parasites charging extra for this.

Sattyamjjain 1 days ago [-]
[flagged]
dfordp11 15 hours ago [-]
[dead]
valeriozen 1 days ago [-]
[flagged]
diarrhea 1 days ago [-]
coldpie 1 days ago [-]
I noticed this, too. valeriozen, can you explain what happened here?

Context, two nearly identical comments from different users.

hackerman70000 at 16:09 https://news.ycombinator.com/item?id=47677483 :

> Cloudflare pushing PQ by default is probably the single most impactful thing that can happen for adotpion. Most developers will never voluntarily migrate their TLS config. Making it the default at the CDN layer means millions of sites get upgraded without anyone making a decision

valeriozen at 16:17 https://news.ycombinator.com/item?id=47677615 :

> cloudflare making pq the default is the only way we get real adoption. most devs are never going to mess with their tls settings unless they absolutely have to. having it happen at the cdn level is the perfect silent upgrade for millions of sites without the owners needing to do anything

wmf 1 days ago [-]
They're using the same AI model?
heliumtera 1 days ago [-]
And that changes what?
bwesterb 1 days ago [-]
If we do our job, it changes nothing. Problem with security generally: no spectacle if it's all correct. :)
ezfe 1 days ago [-]
It would mean that they're future-proofing their security
ljhsiung 1 days ago [-]
"Nothing happened for y2k" energy
ls612 1 days ago [-]
The secrecy around this is precisely the opposite of what we saw in the 90s when it started to become clear DES needed to go. Yet another sign that the global powers are preparing for war.
tptacek 1 days ago [-]
What do you mean? For as long as I remember (back to late 1994) people understood DES to be inadequate; we used DES-EDE and IDEA (and later RC4) instead. What "secrecy" would there have been? The feasibility of breaking DES given a plausible budget goes all the way back to the late 1970s. The first prize given for demonstrating a DES break was only $10,000.
adrian_b 1 days ago [-]
Triple-key DES (DES-EDE) had already been proposed by IBM in 1979, in response to the criticism that the 56-bit keys of DES are far too short.

So practically immediately after DES was standardized, people realized that NSA had crippled it by limiting the key length to 56 bits, and they started to use workarounds.

Before introducing RC2 and RC4 in 1987, Ronald Rivest had, since 1984, used another method of extending the key length of DES, named DESX, which was cheaper than DES-EDE as it used a single block cipher invocation. However, like RC4, DESX was kept an RSA trade secret until it was leaked, also like RC4, during the mid-nineties.

IDEA (1992, after a preliminary version was published in 1991) was the first block cipher function that was more secure than DES and which was also publicly described.

ls612 1 days ago [-]
People were willing to explicitly explain why it was inadequate rather than keep it secret. That is the difference.
tptacek 1 days ago [-]
What was to explain? It had a 56-bit key.
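For scale: the EFF's 1998 Deep Crack machine (roughly $250k of late-90s hardware; the rate figure below is approximate) made that keyspace concrete:

```python
# How long a 56-bit exhaustive search takes at Deep Crack's rate.

KEYSPACE = 2 ** 56   # ~7.2e16 keys
RATE = 9e10          # ~90 billion DES keys/second (approximate)

worst_days = KEYSPACE / RATE / 86_400
print(f"worst case ~{worst_days:.0f} days")  # worst case ~9 days
```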
ls612 1 days ago [-]
Was that the only thing wrong with it? The 90s was definitely before my time, but from reading about it I was under the impression there were also fundamental flaws with DES which led to the competition that ultimately produced AES.
tptacek 1 days ago [-]
Yes, that was what was wrong with DES. I mean, it also had an 8-byte block size, which turns out to be inadequate as well, but that's true of IDEA and Blowfish as well.
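The 8-byte block problem is a birthday-bound issue rather than a key-size one; rough numbers (this is the basis of the Sweet32 attack on 64-bit-block ciphers in TLS):

```python
# After ~2^(n/2) blocks under one key, an n-bit block cipher starts
# leaking via ciphertext collisions. For 64-bit blocks:

BLOCK_BITS = 64
collision_blocks = 2 ** (BLOCK_BITS // 2)             # birthday bound: 2^32
data_gib = collision_blocks * (BLOCK_BITS // 8) / 2**30

print(data_gib)  # 32.0 -- GiB under a single key, easily reached in bulk traffic
```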
NitpickLawyer 1 days ago [-]
My read of the recent google blog post is that they framed it as cryptocurrency related stuff just so they don't say the silent thing out loud. But lots of people "in the know" / working on this are taking it much more seriously than just cryptobros go broke. So my hunch is that there's more to it and they didn't want to say it / couldn't / weren't allowed to.
adrian_b 1 days ago [-]
It should be noted that quantum computers are a threat mainly for interactions between unrelated parties doing ordinary legitimate business, e.g. online shopping, online banking, or notarized legal documents that use long-term digital signatures.

Quantum computers are not a threat for spies or for communications within private organizations where security is considered very important, where the use of public-key cryptography can easily be completely avoided and authentication and session key exchanges can be handled with pre-shared secret keys used only for that purpose.

dadrian 1 days ago [-]
I will bring this up at the next meeting of the secret cryptographer cabal where we decide what information to reveal to non-cryptographers.
IncreasePosts 1 days ago [-]
What is "it" that you're referring to?
wil421 1 days ago [-]
> mitigating harvest-now/decrypt-later attacks.

Most likely the NSA or someone else is ahead of the game and already has a quantum computer. If the tech news rumors are true, the NSA has a facility in Utah that can gather large swaths of internet traffic and process the data.

bookofjoe 1 days ago [-]
tonfa 1 days ago [-]
FYI this is a parody website. (in case it's not obvious)
bookofjoe 1 days ago [-]
It wasn't obvious to me!