ZX calculus is a very interesting framework for doing cutting-edge research in error correction and gate compilation, but it seems wildly off base as a means of making quantum computing accessible to a broader audience. Anything beyond the simple "teleportation is like pulling a string" picture requires extremely difficult abstract manipulations.
(PhD in experimental QC)
For those of us who have a decent computer science and math education and are curious about QC, but have jobs in classical computing, are there any resources you recommend as a better introduction? I understand something about being able to test large numbers of permutations of something at once, or square-rooting the number of necessary operations for some functions...
edit: found this below, but it is all ZX calculus
https://zxcalc.github.io/book/html/main_html.html
https://x.com/coecke/status/1907809898852667702
>High-schoolers excelling at Oxford Uni post-grad quantum exam, thanks to Quantum Picturalism!
thoughts?
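The "square-rooting the number of necessary operations" mentioned above is Grover's search speedup: an unstructured search over N items takes about N/2 classical queries on average, but only about (π/4)·√N Grover iterations. A quick back-of-the-envelope sketch (my own illustration, not from the linked material):

```python
import math

def grover_iterations(n_items):
    """Optimal Grover iteration count for one marked item: floor((pi/4) * sqrt(N))."""
    return math.floor(math.pi / 4 * math.sqrt(n_items))

for n in (10**6, 10**12):
    # Classical unstructured search needs ~n/2 queries on average;
    # Grover's algorithm needs only ~sqrt(n) oracle calls.
    print(f"N={n}: classical ~{n // 2} queries, Grover ~{grover_iterations(n)} iterations")
```

For a million items that is roughly 785 iterations instead of half a million expected queries, which is the quadratic speedup the comment alludes to.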
IANAP so got curious about how ZX Calculus related to Feynman Diagrams (if at all)
Search landed on this neat summary [1] (2024) from the lab(?), which also has a link to the original paper [2]
[1]: https://www.quantinuum.com/blog/quantinuum-scientists-have-p...
[2]: https://arxiv.org/abs/2405.10896
Just to add, buried away there is this gh repo also: https://github.com/zxcalc/book
Ooh thanks, totally missed this!
> The term, “Quantum Picturalism” was coined to describe this unique approach of teaching quantum concepts visually, reducing intimidation and opening the field to broader audiences.
Sadly, the only way to truly “understand” quantum physics is through math. A “shut up and calculate” approach.
Now, my rant.
The main reason math often seems more complicated than it really is has to do with the use of strange symbols and naming conventions. It also feels like academia in the US intentionally uses non-plain language and terminology to sound smarter and more exclusive.
Back in socialist countries, there was a strong effort to name these concepts using “normal” language, and that really helped. When I came to the US, I’d see something like the Fourier transformation and think, “Why use this strange name?” Why not call it “conversion of a signal into frequencies” (loose translation)?
Of course, maybe the reason is that it is easier to create a new word/term in Slavic languages.
>Why not call it “conversion of a signal into frequencies”
I agree, there has to be a combination of Greek & Latin parts to compile into a Germanic-style adjective sandwich of a word which describes the Fourier.
Interpolation is one such word:
Inter - between/among,
pol - fill/smooth/polish,
ation - action or process of the foregoing.
May I propose 'Signalidominotransform' or 'Signalidominomorphosis' as candidates to replace "Fourier Transformation?"
Their pictorial representation is a "shut up and calculate" notation: you can always map it 1:1 onto the linear algebra formulation. It was developed as a better tool of thought for working out proofs in QC.
I had the pleasure of interacting with Duncan and Bob when the ZX calculus was being developed. While I did not use their calculus for my own research, it did inspire the graphical notation I doodled with.
While I agree that naming a thing after a person makes it less clear at first glance, it is definitely not done to intentionally sound smarter and more exclusive; it's simply a handy short label for often subtle and complex things in an environment where you are constantly referring to them. New fields or domains will typically develop their own notation, as they often require new tools of thought. I'm sure you are not suggesting that Feynman diagrams or Einstein summation make things more complicated than they really are.
To your point: the annoying part is when the conventions clash. In early QC works an X could be the X-gate or something else completely. A good chunk of the effort in writing or understanding a QC publication was establishing the notation. After a while notation gets a bit more consolidated as conventions get naturally established. Of course, if you move from a world that has established its own conventions, e.g. behind the iron curtain, it can be frustrating to be confronted with the many eponyms in a field where you already have deep expertise. I had similar experiences just from working in teams that were using different programming languages: "Why call it a SAX parser? That's just tree-recursive descent!"
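One concrete instance of that 1:1 mapping is the ZX spider-fusion rule: two one-input, one-output Z-spiders with phases α and β fuse into a single spider with phase α+β, which in linear algebra is just multiplying diagonal phase matrices. A minimal pure-Python check (my own sketch, not from the thread):

```python
import cmath

def z_phase(alpha):
    # A one-input, one-output Z-spider with phase alpha is the
    # diagonal matrix diag(1, e^{i*alpha}) in the computational basis.
    return [[1, 0], [0, cmath.exp(1j * alpha)]]

def matmul(a, b):
    # 2x2 complex matrix product (sequential composition of the two spiders).
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def same(a, b, tol=1e-9):
    return all(abs(a[i][j] - b[i][j]) < tol for i in range(2) for j in range(2))

alpha, beta = 0.7, 1.9
# Spider fusion: Z(alpha) composed with Z(beta) equals a single Z(alpha + beta).
print(same(matmul(z_phase(alpha), z_phase(beta)), z_phase(alpha + beta)))  # True
```

The diagrammatic rewrite (merging two dots) and the matrix computation give the same answer, which is the sense in which the pictures are "shut up and calculate" notation.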
As if that were the problem with mathematics. 99.9% of mathematics isn't about how things are named. No one has ever failed to understand the Fourier transform because of its name.
I think you are maybe right here.
> it is easier to create a new word/term in Slavic languages.
Why is this the case?
Math needs to be this way for precision. There are dozens of "conversions of a signal into frequencies" besides the Fourier transform: the Z-transform, the Laplace transform, the Wigner transform, and so on. Normal language deals with a few concepts; math deals with thousands.
Soviet countries wanted to isolate their peoples from western ideas. If you start teaching about Fourier transform in schools, students are going to ask who that Fourier is.