I sat through a briefing last week about quantum encryption and the threat that quantum computing poses to the encryption in use today. It was stressed that nation states are hoovering up encrypted data now in order to decrypt it later with quantum computers, much the same way America eventually decrypted old Soviet traffic. I wonder if it will take as long, and if anyone will still be alive to make use of that data.
As has been previously pointed out, the 2001 and 2012 quantum factorisation records may be easily matched with a dog trained to bark three times [33]. We verified this by taking a recently-calibrated reference dog, Scribble, depicted in Figure 6, and having him bark three times, thus simultaneously factorising both 15 and 21. This process wasn’t as simple as it first appeared because Scribble is very well behaved and almost never barks. Having him perform the quantum factorisation required having his owner play with him with a ball in order to encourage him to bark. It was a special performance just for this publication, because he understands the importance of evidence-based science.
If quantum computing progresses at the same rate as it has over the last 30 years, it may take 300 years before it is useful.
https://eprint.iacr.org/2025/1237.pdf
> we also estimate that factorising at least two-digit numbers should be within most dogs’ capabilities, assuming the neighbours don’t start complaining first
> Similarly, we refer to an abacus as “an abacus” rather than a digital computer, despite the fact that it relies on digital manipulation to effect its computations.
I loved this quote as well
This deserves an Ig Nobel Prize lol.
Ig Nobels go to actual research, not to satire.
If you know a better way to factor 35, I’d like to hear it.
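For what it's worth, a few lines of classical trial division will reach 35 (and anything the barking method can) instantly; a minimal sketch:

```python
# Classical trial division: factors 35 (or any small n) rather faster
# than either a dog or the published quantum records.
def factor(n: int) -> list[int]:
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factor(35))  # [5, 7]
```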
If anyone had made meaningful progress on QC in that time, there is no way knowledge of it would have been allowed to become public.
It is one of those domains where success would land you in a gilded prison.
Like LLMs, this isn't the sort of thing where a small group would make a sudden advance and keep it secret, and I doubt that the NSA can build one significantly faster than any industry team today. I think more likely you would need to get worried if someone got one scalable to hundreds or thousands of logical qubits and then stopped publishing.
> I think more likely you would need to get worried if someone got one scalable to hundreds or thousands of logical qubits and then stopped publishing.
Consider the likelihood of managing that without alerting the authorities to what is going on.
If we know anything, it's that development is never linear.
Thanks for sharing this, great read.
This shouldn't be a major issue because of the Forward Secrecy (https://en.wikipedia.org/wiki/Forward_secrecy) principles built into modern TLS protocols, which ensure that even if the public/private key scheme is vulnerable to (for example) quantum attacks, the attack has to happen now, as a MITM during the handshake; otherwise a full traffic capture is useless for future decryption without obtaining some secrets from one of the endpoints.
That said, it's not universally deployed yet (Wikipedia mentions 92.6% of websites), and various means of tricking devices into downgrading to an older protocol would yield traffic that could be decrypted later.
No, this absolutely is not how forward secrecy works in TLS. Forward secrecy protects against a break in the signature algorithm, but not in the key agreement algorithms.
Both the FFDH and ECDH key agreement algorithms are vulnerable to quantum cryptanalysis; someone capturing traffic today could later break that key agreement and then decrypt the data. An attacker would have to capture the entire session up to the "point of interest", though.
This is why FFDH/ECDH are being augmented with post-quantum-secure KEMs (key encapsulation mechanisms).
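To make the hybrid idea concrete, here is a minimal sketch: the session key is derived from both an ECDH shared secret and a KEM shared secret, so a future quantum break of ECDH alone recovers nothing. The X25519 and HKDF calls use the real pyca/cryptography API; the pq_kem_encapsulate function is a placeholder stand-in, not a real ML-KEM.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def pq_kem_encapsulate():
    # Placeholder only: a real KEM returns (ciphertext, shared_secret)
    # derived from the peer's public key.
    return b"<kem-ciphertext>", os.urandom(32)

# Classical ECDH exchange (vulnerable to a future quantum attacker).
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
ecdh_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum KEM contribution (stubbed here).
_ciphertext, kem_secret = pq_kem_encapsulate()

# Feed BOTH secrets into the KDF: breaking only ECDH later yields nothing.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None,
    info=b"hybrid-handshake-demo",
).derive(ecdh_secret + kem_secret)
```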
What I want to know is how they guess which 0.001% of signals or internet traffic is actually worthwhile to keep. The biggest nation states could conceivably store about one year's worth of internet traffic right now, but they also need to store whatever other signals intelligence they're gathering for analysis, so it will be less than a single year's worth.
But almost all that data is going to turn out to be useless if or when they gain the quantum ability to decrypt it, and even the stuff that could be useful now gets less useful with every month it stays encrypted. Stuff that is very useful intelligence now could be absolutely useless in five years…
If you discard all major video streaming sites (including adult entertainment) then you can probably get most of the way there; you're mostly interested in text communication and actual user data, not the video content, which is vastly larger.
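As a back-of-envelope check on that intuition (all figures here are loud approximations, not sourced measurements):

```python
# Rough, assumed numbers: global IP traffic is commonly cited in the
# hundreds of exabytes per month, with video usually estimated as the
# large majority of it.
EB = 10**18
monthly_traffic = 400 * EB   # assumption, order of magnitude only
video_fraction = 0.8         # assumption

kept_per_year = monthly_traffic * 12 * (1 - video_fraction)
print(f"~{kept_per_year / EB:.0f} EB/year after discarding video")
# => ~960 EB/year: still enormous, but ~5x less than keeping everything.
```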
> What I want to know is how they guess which 0.001% of signals or internet traffic is actually worthwhile to keep?
By observing DNS lookups in centralized taps like Room 641A at AT&T.
All of Tor that can be hoovered up seems like a worthwhile investment.
2000 years in the future people will know which porn you slobbed it to.
https://en.wikipedia.org/wiki/Utah_Data_Center
An exabyte isn't as much as it sounds like.
It was a lot more in 2014. Presumably they have upgraded it significantly since.
I wonder if there is any way of figuring out a "data space inflation" metric or similar, like money but for drive space?
So we who grew up with 500MB computers could properly communicate how big the drives "felt" at the time, compared to what we have today.
I was a few years into computer use before I got to experience a hard drive, a whopping 40MB.
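One toy way to make that "felt size" concrete: deflate a drive's capacity by the typical consumer drive of its era, as in this sketch (the era figures are rough illustrative assumptions):

```python
# Approximate "typical consumer drive" per era, in bytes.
# Illustrative assumptions only.
typical_drive = {1990: 40e6, 1995: 500e6, 2024: 2e12}

def felt_size(size_bytes: float, year: int, reference_year: int = 2024) -> float:
    """Express a drive in reference-year bytes: how big it felt then."""
    return size_bytes * typical_drive[reference_year] / typical_drive[year]

# A whopping 40MB drive in 1990 "felt" like a 2TB drive does today.
print(f"{felt_size(40e6, 1990):.1e} bytes")
```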
I don't think this is true at anything resembling a concerning scale.
Even trying to do something like saving 'just' the average yearly traffic Tor handles would account for 2-3% of all currently available storage.
We're talking about the same government that quickly abandoned its quest of 'archiving every tweet in the Library of Congress'.
It's an interesting little nugget of evidence in favor of the simulation hypothesis. We're currently living through the first era in humanity's history where there will be enough raw information to create a workable global-level simulation once that data is decrypted. Pair that with the fact that we're living through such a huge inflection point in history (birth of the internet, humanity becoming a multiplanetary species, and more) and you have a time that people both (1) can and (2) will want to simulate/experience. It's quite interesting.
I'm still convinced that the simulation hypothesis is just religion for the atheist or agnostic, because if it turns out that it's correct and one day you 'wake up' only to find that it was all a simulation, well, how do you know that isn't now also just another simulation? It's a non-theory. But I find this quite compelling circumstantial evidence in favor of that non-theory. An arbitrary number of individuals may be able to experience "this" era throughout our species' future, yet only one group gets to actually live it, and that group will ostensibly be orders of magnitude smaller than the sum total of all who will later 'experience' it. Statistically, you're rather more likely to belong to the simulated group than the real one, if these assumptions are correct.
Sat in a similar briefing in 2018, sounds like the same talking points still.
> The solution was reached by using codebreaking software the team had developed along with extensive manual work, in part required because Perwich had mistakenly omitted a couple of letters in his ciphertext.
That explains how the team of three codebreakers got it, but what about the other codebreaker, Matthew Brown, who figured it out by himself? The article doesn't say anything about his approach. It seems impressive that he could match the effort of three cryptographers using their own custom software. I want to read more about him!
It's fascinating to me that the keywords were further encoded such that even if the message was deciphered, the strategic plans could not be acted upon.
It was quite common to have a codebook that might list several numbers or words used to substitute for places, individuals, actions, etc.
These also existed for corporate entities. A concern might have its own codebook so that the telegraph office would not be privy to its internal business.
They would also use codebooks as a type of compression, since the telegraph company charged less for sending English words than for enciphered characters, and there are many uncommon words that could substitute for longer common phrases.
https://en.wikipedia.org/wiki/Codebook
https://en.wikipedia.org/wiki/Commercial_code_(communication...
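To illustrate the dual secrecy-plus-compression role described above, a toy codebook (entries invented purely for illustration) might work like this:

```python
# Toy codebook: substitute whole phrases with innocuous single words,
# hiding meaning from the telegraph office and cutting the billable
# word count at the same time.
CODEBOOK = {
    "deal has fallen through": "WALNUT",
    "send funds immediately": "BIRCH",
}
# The receiving end inverts the mapping to decode.
DECODE = {word: phrase for phrase, word in CODEBOOK.items()}

def encode(message: str) -> str:
    for phrase, word in CODEBOOK.items():
        message = message.replace(phrase, word)
    return message

print(encode("deal has fallen through send funds immediately"))
# => "WALNUT BIRCH": two billable words instead of seven.
```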
Spying was a profession 400 years ago due to the intense rivalry between France and England. Vast sums and amounts of time were spent on various schemes, probably in revenge for losing western Europe in the Hundred Years' War. Religion seems to have been mostly a cover for plunder and stealing treasure. France remained mostly Catholic; the religious turnover in England, however, created interesting dynamics.
The proprietor of Maryland landed the title Baron Baltimore through friendship with the spymaster to Queen Elizabeth, "whom Calvert had met during an extended trip to the European mainland between 1601 and 1603".
"He also held the title of Earl of Salisbury in 1605 and Lord High Treasurer in 1608, making him the most powerful man at the royal court."
The Calvert family lost the Maryland estate after the armed insurrection of 1689; however, the Crown returned the estate to the Calvert family in 1715 "after Charles Calvert, 5th Baron Baltimore, declared in public that he was a Protestant."
The final "23rd Governor of Restored Proprietary Government", Sir Robert Eden, 1st Baronet, was the great-great-grandfather of Prime Minister Anthony Eden.
https://en.wikipedia.org/wiki/George_Calvert,_1st_Baron_Balt...
https://en.wikipedia.org/wiki/Province_of_Maryland
https://en.wikipedia.org/wiki/Secret_Treaty_of_Dover
The article mentions codebreaking software; any idea what that is? I have a coded letter that is about 250 years old, but it is written in Devanagari script, so letter frequency analysis isn't straightforward. Please suggest any tools that could help with decoding.
Do you have a link to your coded letter available online?
I don't, but I can upload a sample tomorrow in case you're interested in taking a look.
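On the frequency-analysis point above: Devanagari is an abugida, so counting raw code points splits consonants from their vowel signs. Counting grapheme clusters is a more meaningful unit; a minimal sketch using the third-party `regex` module (its \X matches one extended grapheme cluster):

```python
from collections import Counter
import regex  # third-party: pip install regex

def grapheme_frequencies(text: str) -> Counter:
    # \X yields whole grapheme clusters, keeping consonant+matra
    # combinations together rather than splitting them apart.
    clusters = regex.findall(r"\X", text)
    return Counter(c for c in clusters if not c.isspace())

sample = "नमस्ते दुनिया"  # placeholder text, not the actual letter
for cluster, count in grapheme_frequencies(sample).most_common():
    print(cluster, count)
```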
I love that they had an SCI system on top of their cipher.
It's amazing to read the ciphertext of events so commonly portrayed in historical dramas. One always assumes huge creative license and a degree of historical bias or other error, and then one gets a text message from the past confirming that everything on screen is in fact accurate.
So many machinations among the elites, even hundreds of years ago. And imagine today, with the current set of goons in power. It's amazing they've survived for so long.
.... Satoshi?
Different Satoshi.
Not a cryptologist, not cryptology.