Quantum Computing Breakthrough Sparks Post-Quantum Cryptography Urgency
Researchers running Shor's algorithm have broken a 5-bit elliptic curve key on a 133-qubit quantum computer. The key itself is trivially small, but the demonstration is a milestone that underscores the threat quantum computing poses to current public-key encryption standards. It also points to substantial opportunities for cybersecurity firms specializing in post-quantum cryptography (PQC): businesses, governments, and individuals will need to invest heavily in PQC research, development, and implementation to secure their data against future quantum attacks.
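The migration the summary points to can already be prototyped today. Below is a minimal key-encapsulation sketch, assuming the open-source liboqs Python bindings (liboqs-python) are installed and following their published usage pattern; the algorithm identifier is "Kyber512" on older liboqs releases and "ML-KEM-512" on newer ones, and none of this comes from the experiment discussed above.

```python
# Minimal post-quantum key-encapsulation sketch (assumes `pip install liboqs-python`).
# The algorithm name may need to be "ML-KEM-512" depending on the liboqs version.
import oqs

KEM_ALG = "Kyber512"

with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
    public_key = receiver.generate_keypair()                      # receiver publishes a PQC public key
    ciphertext, secret_sender = sender.encap_secret(public_key)   # sender encapsulates a shared secret
    secret_receiver = receiver.decap_secret(ciphertext)           # receiver recovers the same secret
    assert secret_sender == secret_receiver
    print("post-quantum shared secret established:", secret_receiver.hex()[:16], "...")
```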
Origin Reddit Post
r/technology
Shor’s Algorithm Breaks 5-bit Elliptic Curve Key on 133-Qubit Quantum Computer
Posted by u/upyoars • 07/21/2025
Top Comments
u/xdeltax97
Now try SHA 256 and then I’ll be impressed.
u/SnowedOutMT
Right? 4 years ago AI was putting out janky images that sort of resembled what you were asking it for. It was a "gimmick" and people minimized it the same way. Today it's baked into everything.
u/Bokbreath
source plz ?
u/caedin8
You can do the exact same with a traditional algorithm.
The probability starts with a uniform distribution of 1/32, then you try a key and the probability of the others collapses to 0 or 1/31.
u/LXicon
That could be said for an algorithm that just picks answers at random: *"if the result is wrong you just keep running it till you get the right answer."*
u/Trick_Procedure8541
the post 3 years for now will be epic
u/Trick_Procedure8541
Definitely. There's nothing other than noise from Heron.
u/DoughnutCurious856
correct. But if you have a higher probability than random at getting the right answer, it will take many fewer iterations.
u/old_bugger
Recently broke a key on my computer too. Gives me the shits.
u/loptr
One of the major achievements is that they managed to maintain coherence through 67000 layers of operations. It makes me doubt you have any understanding of quantum computing, the challenges
u/davispw
This is a single transistor compared to today’s hundred-billion-transistor microprocessors. Yet the transistor itself was still a revolutionary invention.
u/yoshiatsu
Bitcoin wallets use ECC.
u/bakgwailo
>Traditional computing sounds like it's still winning, but it's a cool proof of concept on quantum computing.
Shor's algorithm allows factoring of integers in polynomial time. It is signi
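For readers unfamiliar with why period finding leads to factors, here is a small Python sketch of the classical half of Shor's algorithm; the quantum period-finding step is replaced by brute force, and N = 15, a = 7 are textbook illustration values, not numbers from this experiment.

```python
# Classical skeleton of Shor's factoring algorithm. Only the period search is
# quantum in the real algorithm; brute-forcing it here defeats the speedup but
# shows how a period r of a^x mod N turns into nontrivial factors of N.
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (the part a quantum computer speeds up)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical(n, a):
    g = gcd(a, n)
    if g != 1:
        return g, n // g            # lucky guess already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None                 # odd period: retry with a different a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None                 # trivial square root: retry
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical(15, 7))        # -> (3, 5); 7 has period 4 modulo 15
```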
u/Trick_Procedure8541
For each of those, go to Google / Google Images and add "roadmap" to your search query.
5 years ago they were projecting 2035 to get there but everyone has shifted forward by several years
and a
u/ericonr
Why would you trust roadmaps from hype based companies?
u/tyler1128
Based on my understanding of quantum computing, because of decoherence and the general probabilistic nature of wavefunctions, algorithms are often run many times with the most common result o
u/sweetno
But wouldn't AI at this point be able to solve this without this completely incomprehensible steam punk machinery?
u/jcunews1
Duh, anyone can easily break a 5-bit cipher with their two hands.
u/troelsbjerre
Using experimental hardware, worth so much that IBM won't even quote you a price, they are able to pick the right key out of 32 possible keys, using 100 samples.
u/brodogus
Yeah, quantum computers are useless at this scale. But they scale exponentially for some problems. The quantum computer would be way faster if there were 2^2048 possibilities (though it would
u/SethBling
Sure. Then in 18 months they double the number of qbits and can pick the right key out of 1024 possible keys. Then in another 18 months out of 1,048,576 possible keys. And then in another 18
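Purely to spell out the arithmetic behind this extrapolation (the 18-month doubling assumption itself is disputed in the replies below): doubling the attackable key length squares the number of candidate keys.

```python
# Arithmetic of the comment above: IF the attackable key length doubled every
# 18 months (a contested assumption), the key space would square each step.
bits = 5                                   # today's demo: 2**5 = 32 candidates
for months in range(0, 91, 18):
    print(f"+{months:2d} months: {bits:3d}-bit key, {2**bits:.3e} candidates")
    bits *= 2
# Real elliptic-curve keys (e.g. P-256, secp256k1) are 256 bits, i.e. 2**256 candidates.
```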
u/BNeutral
And output positive continuous nuclear fusion is just 10 years away, right? Nobody can predict shit
u/Bokbreath
>Then in 18 months they double the number of qbits
That seems highly doubtful. Do you have any evidence to support this?
u/sloblow
Dumb noob question: how does the computer know when the key is broken?
u/Bokbreath
you made the claim, you post the results - which you of course can't because nobody has made such an absurd claim.
u/TheFeshy
This is basically like doing that, but using a billion dollars and a lot of research to weight the dice.
u/nicuramar
> Sure. Then in 18 months they double the number of qbits
That hasn’t really been the case so far.
> Hope no one is currently transmitting anything using elliptic curve keys that they
u/Trick_Procedure8541
IBM, IonQ, Quantinuum, PsiQuantum, QuEra, Atom Computing, Pasqal, to name a few.
u/brodogus
I think it produces a probability distribution of all possible answers, which gets more and more skewed to the correct answer as you run the operations over and over. Then you need to perform
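A toy simulation of that readout idea, with a made-up 20% weight on the correct key (the real output distribution depends on the circuit and its noise), echoing the "32 keys, 100 samples" figures mentioned earlier in the thread:

```python
# Toy model of noisy quantum readout: the correct key is only somewhat more
# likely than the 31 wrong ones, so take many shots and keep the most common.
import random
from collections import Counter

KEYS = list(range(32))
CORRECT = 13                                         # arbitrary placeholder key
weights = [0.20 if k == CORRECT else 0.80 / 31 for k in KEYS]

shots = Counter(random.choices(KEYS, weights=weights, k=100))
guess, count = shots.most_common(1)[0]
print(f"most frequent outcome: key {guess} ({count}/100 shots), correct={guess == CORRECT}")
```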
u/nicuramar
Yeah the article is weird in several ways like that.
u/CanvasFanatic
A childlike misapprehension of Moore’s Law as a universal force.
u/caughtinthought
Lol that's not how things work in quantum. They ain't doubling shit
u/Trick_Procedure8541
Naw dude, the world has changed. People's timelines have moved up; fault tolerance is expected 5 years sooner now.
IBM/Oxford Ionics projects 8,000 fault-tolerant qubits in 2029
[https
u/dusk534
*clears throat* huh?
u/Kitchen-Agent-2033
Soon be doing better than 1943, when 100 women (“calculators”) would use the same math processes to sample, and filter which ones to then give to the (fewer) prodigy with next level cryptogra
u/Kitchen-Agent-2033
Back to being fodder for “evidence of morality failings” in today’s immigration world.
Just needs the wrong folk in charge.
Never forget Henry VIII exterminated a larger percentage of his p
u/Code4Reddit
True. But since answers are easily verified by a traditional computer, if the result is wrong you just keep running it till you get the right answer. So the algorithm can still be used to pro
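A sketch of that retry loop: `quantum_guess()` below is a hypothetical stand-in for one run of the quantum sampler (modeled as a biased random draw), and `verify()` is the cheap classical check, which for ECC would be a single scalar multiplication (see the toy-curve example further down).

```python
# "Run until verified": each quantum run yields a candidate key and a classical
# computer cheaply checks it. Both functions here are illustrative stand-ins.
import random

SECRET = 13                      # placeholder 5-bit private key
KEYSPACE = list(range(32))

def quantum_guess():
    """Stand-in for one quantum run: biased toward the right answer, often wrong."""
    return SECRET if random.random() < 0.2 else random.choice(KEYSPACE)

def verify(candidate):
    """Classical check; for ECC: does candidate * G equal the public key?"""
    return candidate == SECRET

runs = 0
while True:
    runs += 1
    candidate = quantum_guess()
    if verify(candidate):
        break
print(f"verified key {candidate} after {runs} quantum runs")
```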
u/Such_Introduction592
Shor's Bones!
u/caedin8
Couldn't a traditional algorithm solve this in worst case 32 candidates? I mean there are 32 options, so if you try them all you'd be done.
Why did the quantum computer need 100 tries to pick
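For scale, the classical baseline the comment describes is just an exhaustive sweep of the 32 candidates; `check()` below is a placeholder for the one cheap EC scalar multiplication each trial would really cost.

```python
# Classical baseline: try all 32 candidates. Worst case 32 checks, 16.5 on average.
SECRET = 13                              # placeholder 5-bit private key

def check(candidate):
    return candidate == SECRET           # stands in for candidate * G == public_key

tries = next(i + 1 for i in range(32) if check(i))
print(f"found key {SECRET} after {tries} of at most 32 checks")
```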
u/Trick_Procedure8541
[https://github.com/Strilanc/falling-with-style](https://github.com/Strilanc/falling-with-style)
u/aDutchofMuch
Could you describe how we plan to sound the coherence problem in rhyme, for more digestible understanding?
u/bakgwailo
Except doing it as random doesn't improve time complexity, which Shor's most certainly does.
u/knightress_oxhide
One hand to count and one hand to write the answer.
u/Trick_Procedure8541
Look at roadmaps for QC companies.
Four years from now, half a dozen well-funded players are expecting 1,000 fault-tolerant qubits in production around 2029 and the ability to keep scaling i
u/SludyAcorn
ECC operations are deterministic in nature, meaning if you enter your private scalar of, let's say, 0xF, it will equal its public key (the answer) every time. Now on ECC operations. Yo
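To make the determinism concrete, here is a minimal sketch on a textbook toy curve (y² = x³ + 2x + 2 over GF(17), generator (5, 1)); this is not the curve from the paper, just a small group where the same private scalar always maps to the same public point.

```python
# Deterministic ECC key generation on a toy curve: y^2 = x^3 + 2x + 2 over GF(17)
# with generator G = (5, 1). Same scalar in, same public point out, every time.
P, A = 17, 2                     # field prime and curve coefficient a
G = (5, 1)
INF = None                       # point at infinity (group identity)

def add(p, q):
    """Add two points on the curve."""
    if p is INF:
        return q
    if q is INF:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return INF                                        # p + (-p) = identity
    if p == q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P    # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P           # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def scalar_mul(k, point):
    """Double-and-add computation of k * point."""
    result, addend = INF, point
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

priv = 0xF                                        # the comment's example scalar (15)
print(scalar_mul(priv, G), scalar_mul(priv, G))   # identical public point both times
```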
u/crozone
Yes, but so far there is no quantum computer running Shor's that can practically best a classical computer at the same task.
u/troelsbjerre
... managed to maintain ***sufficient*** coherence through 67000 layers to solve the trivial toy problem. This implies that it would fall apart with additional layers. And notice the phrasing
u/Tyilo
Reminds me of https://github.com/Strilanc/falling-with-style
u/madprgmr
Yes, it has the _potential_ to scale up, unlike factoring on traditional computing... but a 5-bit key is trivial to brute force. Like, you could even do it with pen and paper in under an hour
u/Fheredin
I am... somewhat skeptical of that. Moore's Law became a thing because we were gaining all purpose computational power. Qbits right now are mostly monotaskers for breaking EC cryptography.
I
u/Puny-Earthling
Yeah. Minimising this is incredibly shortsighted. Hope you all don't have some sordid secrets that will come to light when RSA and EC inevitably get cracked.
u/madprgmr
EC cryptography hasn't been considered quantum-safe for at least a decade as it's been known to be weak to this particular algorithm.
This article also makes me wonder what's going on, as it
u/Kitchen-Agent-2033
In 1945, no one used the American troops' code-making tools in the field for anything more than today's sensitive data (attack hill 3, at 4pm).
Decoding useless facts is useless (unless traini
u/Trick_Procedure8541
in three words: they’ve solved fidelity
For some there are clearly unresolved problems, like 2ms gate times and 1s coherence. Or photonic interconnects for 2-qubit chips with 10% path loss u
u/serg06
Forget bogosort, now we have quantum bogosort.
u/Bokbreath
I meant published sources, not names of companies. Give me their public statements that assert the claim.
u/bothering
With the way our rights are going these past few years, anything that is outside “I’m straight and white and I love America and Amazon” would be considered a sordid secret