The 'JVG algorithm' is a perfect example of what happens when theory meets the harsh reality of polynomial scaling. It’s a recurring pattern in the field: someone manages to get a toy circuit to factor '15 = 3*5' and the hype machine instantly pivots to 'quantum supremacy.'
As an engineer, I care about where the curve bends. If your 'supremacy' algorithm fails the moment you step out of the 'demo sandbox,' it's not a solution—it's a nerd-snipe. I'd much rather see 10 years of incremental work on error correction than 10 minutes of hype over a flawed paper.
The title of this post changed as I was reading it. "It looks like the 'JVG algorithm' only wins on tiny numbers" is a charitable description. The article is Scott Aaronson lambasting the paper and shaming its authors as intellectual hooligans.
Scott Aaronson is the guy who keeps claiming quantum supremacy is here every year, so he's the proverbial pot calling the kettle black.
Scott references the top comment on this previous HN discussion:
https://news.ycombinator.com/item?id=47246295
> (yes, the authors named it after themselves)

The same way the AVL tree is named after its inventors - Georgy Adelson-Velsky and Evgenii Landis... Nothing peculiar about this imho
Adelson-Velsky and Landis were not the ones who named their tree the "AVL tree".
In my "crackpot index", item 20 says:
> 20 points for naming something after yourself. (E.g., talking about the "The Evans Field Equation" when your name happens to be Evans.)
I find it especially strange that two of the authors gave their first name to the algorithm.
Like RSA?
RSA was also not given that name by its authors, the name came later, which is usually the case.
In the original paper they do not give it any name: https://people.csail.mit.edu/rivest/Rsapaper.pdf
Same with RSA and other things. I think the author's point is that slapping your name on an algorithm is a pretty big move (practically, you can only do it a few times in your life before it gets too confusing), so it's a gaudy thing to do, especially for something illegitimate.
Leonhard Euler has entered the chat: https://en.wikipedia.org/wiki/List_of_topics_named_after_Leo...
if you’re Euler you get a pass
Named after != named by
I mean, considering that no quantum computer has ever actually factored a number, a speedup on tiny numbers is still impressive :P
The problem is that it's an exponential slowdown on large numbers.
I didn't get the quantum hype last year. At least with AI, you can see it do some impressive things with caveats, and there are bull and bear cases that are both reasonable. The quantum hype train is promising the world, but compared to AI, it's at the linear regression stage.
It's a variation of nerd snipe. https://xkcd.com/356/
People get taken in by the theoretical coolness and ultimate utility of the idea, and assume it's just a matter of clever ideas and engineering to make it a reality. At some point, it becomes mandatory to work on it, because the win would be so big it would make them famous and win all sorts of prizes and adulation.
QC is far earlier than "linear regression", because linear regression worked right away when it was invented (and reinvented multiple times, I think). With QC, instead, we have: an amazing theory based on our current understanding of physics, the ability to build lab machines that exploit the theory, and some immediate applications waiting if a powerful enough quantum computer is ever built. On the other hand, making one that beats a classical computer at anything other than toy challenges is a huge engineering challenge, and every time somebody comes up with a QC that does something interesting, it spurs the classical computing folks to improve their results, which can be immediately applied on any number of off-the-shelf systems.
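For what it's worth, the "worked right away" point about linear regression holds because ordinary least squares has a closed-form solution; a fit needs nothing but arithmetic. A minimal one-variable sketch (the data points are made up for illustration):

```python
def fit_line(xs, ys):
    """Fit y = m*x + b by ordinary least squares (closed form):
    m = cov(x, y) / var(x), b = mean(y) - m * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# Points on the line y = 2x + 1 are recovered exactly.
m, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(m, b)  # -> 2.0 1.0
```

No iteration, no tuning, no special hardware, which is exactly the contrast being drawn with where quantum computing stands today.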
Hey hey, 15 = 3*5 is factoring.
My understanding is that they factored 15 using a modular exponentiation circuit compiled with the prior knowledge that 3 is a factor. Factoring 15 when you already know 3 is not so impressive. Shor's algorithm has never been run with a full modular exponentiation circuit.
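For context on why that compilation shortcut matters: in Shor's algorithm the quantum hardware is only responsible for finding the period r of a^x mod N; everything else is classical. A toy sketch below brute-forces the period classically (the step the demos effectively hard-code) and then runs the standard classical post-processing for N = 15, a = 7:

```python
from math import gcd

def classical_period(a: int, n: int) -> int:
    """Brute-force the period r of f(x) = a^x mod n, i.e. the
    smallest r > 0 with a^r = 1 (mod n). This is the only step a
    quantum computer is needed for in Shor's algorithm."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_postprocess(a: int, n: int):
    """Classical part of Shor's algorithm: given an even period r,
    gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n) yield factors
    of n (unless we were unlucky in the choice of a)."""
    r = classical_period(a, n)
    if r % 2 == 1:
        return None  # odd period: retry with a different base a
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    if 1 < f < n:
        return f, n // f
    return None

print(shor_postprocess(7, 15))  # period of 7 mod 15 is 4 -> (3, 5)
```

If the modular exponentiation circuit is compiled using the known factorization, the quantum part of the demo is answering a question whose answer was baked in, which is the criticism here.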