I didn't realize the article was from 2016. I still have to click motorcycles and traffic lights sometimes, so nothing has changed at all! I didn't know they had been using those motorcycles and traffic lights for almost 10 years already.
Once quantum computers are possible, are there actually any other real-world applications, besides breaking crypto and number-theory problems, that they can do much better than regular computers?
Yes, in fact they might be useful for chemistry simulation long before they are useful for cryptography. Simulations of quantum systems inherently scale better on quantum hardware.
The video is essentially an argument from the software side (ironically, she thinks the hardware side is going pretty well). Even if the hardware weren't so hard to build or scale, there are surprisingly few problems where quantum algorithms have turned out to be useful.
It is tough to beat classical computers. They work really well, and a huge amount of time (including some of mine) has gone into developing fast algorithms for them to do things they're not naturally fast at, such as quantum chemistry.
One theoretical use case is “Harvest Now, Decrypt Later” (HNDL) attacks, also called “Store Now, Decrypt Later” (SNDL): if an oppressive regime saves encrypted messages now, it can decrypt them later, once quantum computers can break RSA and ECC.
It's a good reason to implement post-quantum cryptography.
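One common mitigation pattern is hybrid key exchange: derive the session key from both a classical and a post-quantum shared secret, so recorded traffic stays safe unless both are broken. A minimal sketch in Python using pyca/cryptography's X25519 and HKDF; the ML-KEM shared secret is left as a placeholder, since post-quantum library support is still settling:

    # Hybrid key derivation sketch: the session key depends on BOTH secrets.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    ours, theirs = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_secret = ours.exchange(theirs.public_key())

    # Placeholder: in a real deployment this would come from an ML-KEM
    # encapsulation against the peer's post-quantum public key.
    pq_secret = b"\x00" * 32

    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None,
        info=b"hybrid handshake demo",
    ).derive(classical_secret + pq_secret)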
Wasn't sure if you meant crypto (btc) or cryptography :)
I will never get used to ECC meaning both "Error Correcting Code" and "Elliptic Curve Cryptography." That said, this isn't unique to quantum expectations. Faster classical computers or better classical techniques could make various problems easier in the future.
From TFA: ‘One more time for those in the back: the main known applications of quantum computers remain (1) the simulation of quantum physics and chemistry themselves, (2) breaking a lot of currently deployed cryptography, and (3) eventually, achieving some modest benefits for optimization, machine learning, and other areas (but it will probably be a while before those modest benefits win out in practice). To be sure, the detailed list of quantum speedups expands over time (as new quantum algorithms get discovered) and also contracts over time (as some of the quantum algorithms get dequantized). But the list of known applications “from 30,000 feet” remains fairly close to what it was a quarter century ago, after you hack away the dense thickets of obfuscation and hype.’
I believe the most practical use would be compression. Devices could have quantum decoder chips that give us massive compression gains, which would also greatly expand effective storage capacity. Even modest chips, available far before the scale necessary for breaking cryptography is realized, could give compression gains on the order of 100 to 1000x. IMO that's the real game changer. The theoretical modeling and cryptography breaking that you see papers being published on is much further out. The real work, which isn't being publicized because of the importance of trade secrets, is on storage and compression.
Suppose you're compressing the text of a book: How would a quantum processor let you get a much better compression ratio, even in theory?
If what you're actually describing is the density of information on some kind of physical object, that's not data compression; that's just a different storage medium.
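For what it's worth, there is a standard information-theoretic answer here: lossless compression is bounded below by the entropy of the source, and that bound is a property of the data, not of the machine doing the decoding (for the quantum case, Holevo's theorem likewise limits n qubits to carrying at most n classical bits of accessible information). A rough sketch of the classical bound, using a simple order-0 character model:

    # Order-0 entropy: a lower bound on lossless compression under a
    # memoryless character model. No decoder hardware changes this number.
    from collections import Counter
    from math import log2

    def order0_entropy_bits(text: str) -> float:
        counts = Counter(text)
        n = len(text)
        return -n * sum(c / n * log2(c / n) for c in counts.values())

    sample = "the quick brown fox jumps over the lazy dog " * 100
    print(len(sample) * 8, "raw bits")                 # 36000
    print(round(order0_entropy_bits(sample)), "bits")  # much less, but a hard floor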
Relative times are nice for recent times (e.g. "5 minutes ago" is better than "2025-12-18 13:03"), but they should "decay" into absolute times for anything that isn't fairly recent - like a week or two, perhaps.
It varies by use case. Think of e.g. an SRS flash card where your next review is in 2 years. I honestly don't care if 2 years here means 21 months or 28 months, and I especially don't care that the next review is on the 21st of February 2028 at 13:52. All I want to know is that the next review is so far in the future it may not actually happen.
That's a fair point. I'm thinking of the use case of formatting a past date on something like a social media post/comment. (For example, a comment on HN - which uses a rather long cutoff for relative dates.)
I agree with you; I also prefer absolute date stamps, partly because the page might be printed out, etc. However, the <time> element would allow that to work, since it can carry the absolute timestamp even when the displayed text is relative, if it is implemented that way.
It is particularly annoying in a screenshot or printed document. I rarely print onto paper, but occasionally, I will "print" an interesting blog post into a PDF.
I like it but I think the granularity needs to be fixed. For example, the cutoff points should be 21+ months -> 2 years instead of 13+ months -> 1 year.
So basically you want the cutoff to be >1.66x the next unit before you display in that unit. That means: 40 hours -> 2 days; 11 days -> 2 weeks; 6 weeks -> 2 months; 20 months -> 2 years.
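A minimal sketch of that rule in Python; the unit lengths (30-day months, 365-day years) are my own rough assumptions, so exact boundaries differ slightly from the figures above:

    # Relative-time formatting with the ">1.66x of the next unit" cutoff rule.
    UNITS = [          # (name, length in seconds)
        ("second", 1),
        ("minute", 60),
        ("hour", 3600),
        ("day", 86400),
        ("week", 7 * 86400),
        ("month", 30 * 86400),   # assumption: 30-day months
        ("year", 365 * 86400),   # assumption: 365-day years
    ]

    def humanize(age_seconds: float) -> str:
        name, length = UNITS[0]
        for next_name, next_length in UNITS[1:]:
            if age_seconds > 1.66 * next_length:   # only switch past the cutoff
                name, length = next_name, next_length
        n = round(age_seconds / length)
        return f"{n} {name}{'s' if n != 1 else ''} ago"

    print(humanize(40 * 3600))    # "2 days ago"  (40 hours)
    print(humanize(613 * 86400))  # "2 years ago" (613 days, roughly 20 months)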
I'm annoyed by things moving unbidden, especially in clickable interfaces. This element being all over Slack, chat apps, etc. means that things are always shifting around slightly and at unpredictable times.
Mostly Verisign, which required faxing forms and eye-watering amounts of money. Then Thawte, which brought down prices to a more manageable US$500 per host or so. Which might seem excessive, but was really peanuts compared to the price of the 'SSL accelerator' SBus card that you also needed to serve more than, like, 2 concurrent HTTPS connections.
And you try telling young people that ACME is a walk in the park, and they won't believe you...
And then sketchy resellers for Verisign/Thawte, which were cheap but invariably had websites that ironically did not inspire confidence in typing in your credit card number.
As GP posited, because of this headache, lots of web traffic was plain ol' HTTP. Let's Encrypt is owed a lot of credit for drastically reducing plain ol' HTTP.
They were great in the beginning. Then, when you issued a few more certs than they liked, you were asked to pony up some $$$. When you did that, and actually "verified" who you were on a personal international phone call, you got a grace period; then you issued a few more, and they decided they didn't like you, so they would randomly reject your renewals close to the expiration date. And then they got bought out by some scummy foreign outfit, which apparently caused the entire CA to be de-listed as untrustworthy in all major browsers. Quite the ride.
Also, the only website I've ever encountered that actually used the HTML <keygen> tag.
Either you used HTTP, or a self-signed cert if you did not mind the warning. I remember there being one company that did offer free certificates that validated, but I can't remember its name.
This constraint allows making a linear array of all 4 billion values, with the key as the array index, which fits in 16 GiB. Another 512 MiB is enough to hold one bit per key indicating present or not.
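A scaled-down sketch in Python, using 16-bit keys so it runs anywhere; with 32-bit keys the same layout becomes exactly the 16 GiB value array plus 512 MiB presence bitmap described above:

    # Direct-address table: value array indexed by key, plus a presence bitmap.
    import struct

    KEY_BITS = 16                                  # 32 in the full-size version
    values = bytearray((1 << KEY_BITS) * 4)        # 4 bytes per possible key
    present = bytearray((1 << KEY_BITS) // 8)      # 1 bit per possible key

    def put(key: int, value: int) -> None:
        struct.pack_into("<I", values, key * 4, value)
        present[key >> 3] |= 1 << (key & 7)

    def get(key: int):
        if present[key >> 3] & (1 << (key & 7)):
            return struct.unpack_from("<I", values, key * 4)[0]
        return None   # key absent

    put(12345, 999)
    assert get(12345) == 999 and get(54321) is None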
Perhaps text strings as keys and values would give a more interesting example...
That's not a constraint as much as the worst case size.
If you actually only have a handful of entries in the table, it is measurable in bytes.
A linear array of all 4 billion possible values will occupy 16 GiB (of virtual memory) upfront. We have then essentially replaced the hash table with a radix tree: the one made up of the page directories and tables of the VM system. If only the highest and lowest values are present in the table, then we only need the highest and lowest pages (4 kB or whatever) to be mapped. It's not very compact for small sets, though; storing N numbers in random locations could require as many as N virtual memory pages to be committed.
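A quick way to see that lazy commit in action (a sketch, assuming a 64-bit Linux box where anonymous mappings are demand-paged): reserve the full 16 GiB of address space up front and touch only two slots; resident memory grows by just a couple of pages:

    import mmap
    import struct

    SLOTS = 1 << 32                    # one 4-byte slot per 32-bit key
    table = mmap.mmap(-1, SLOTS * 4)   # 16 GiB of virtual address space;
                                       # physical pages commit only on first touch

    def put(key: int, value: int) -> None:
        struct.pack_into("<I", table, key * 4, value)

    def get(key: int) -> int:
        return struct.unpack_from("<I", table, key * 4)[0]

    put(0x00000000, 123)    # commits one 4 kB page at the bottom
    put(0xFFFFFFFF, 456)    # commits one page at the very top
    print(get(0x00000000), get(0xFFFFFFFF))   # 123 456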