1) Generally, the two models of QC are the digital/circuit model (analogous to digital logic gates, with some caveats such as the reversibility of operations and the no-cloning theorem) and analog computation (tuning the parameters of a continuous-time quantum system in your lab so that the system produces useful output).
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers; people are trying to realize computation with several different physical platforms: trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways to simulate quantum computation on classical hardware. Perhaps the most common is through something like IBM's Qiskit, where you keep track of the quantum computer's degrees of freedom throughout the computation and apply quantum logic gates in circuits. Another, more complicated method is tensor network simulation, which efficiently simulates a restricted subset of quantum states on classical hardware.
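To make the first approach concrete, here is a minimal sketch of a statevector simulator in plain NumPy (the function names are mine, not Qiskit's): an n-qubit state is a vector of 2^n complex amplitudes, and gates act on the corresponding tensor factors.

```python
import numpy as np

# Minimal statevector simulator: an n-qubit state is a vector of 2^n
# complex amplitudes; a single-qubit gate is a 2x2 unitary applied to
# one tensor factor of the state.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                  # Pauli-X

def apply_1q(state, gate, k, n):
    """Apply a single-qubit gate to qubit k of an n-qubit state."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [k]))
    psi = np.moveaxis(psi, 0, k)   # restore the original axis order
    return psi.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply CNOT by flipping the target axis in the control=1 subspace."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1
    sub = psi[tuple(idx)]                        # control=1 slab
    t = target if target < control else target - 1  # axis index shifts after slicing
    psi[tuple(idx)] = np.flip(sub, axis=t)       # X = bit flip on the target axis
    return psi.reshape(-1)

n = 2
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                       # |00>
state = apply_1q(state, H, 0, n)     # Hadamard on qubit 0
state = apply_cnot(state, 0, 1, n)   # entangle -> Bell state (|00> + |11>)/sqrt(2)
print(np.round(state, 3))
```

The exponential cost is visible directly: the state vector has 2^n entries, which is exactly why these simulators top out at a few dozen qubits.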
4) In terms of research, one particularly interesting (although I'm biased by working in the field) application is quantum algorithms for nuclear/high-energy physics. Classical methods (Lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-hard Monte Carlo sign problems), and one potential way around this is to simulate nuclear systems on quantum computers instead of classical ones ("The best model of a cat is another cat; the best model of a quantum system is another quantum system").
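The sign problem mentioned above can be illustrated with a toy example: when the Monte Carlo weights carry a complex phase, observables get divided by the average phase, which shrinks exponentially as phase fluctuations grow, so the signal-to-noise ratio collapses. A minimal sketch (my own toy model, not an actual lattice calculation):

```python
import numpy as np

# Toy sign problem: samples x ~ N(0,1) carry phases exp(i*lam*x).
# Reweighted observables divide by the average phase <s>, which for
# this Gaussian model is exactly exp(-lam^2/2) -- exponentially small
# in the phase fluctuation strength lam, so statistical errors blow up.

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)

for lam in (0.5, 1.0, 2.0):
    s = np.exp(1j * lam * x)
    print(f"lam={lam}: <sign> = {s.mean().real:.4f}  "
          f"(exact exp(-lam^2/2) = {np.exp(-lam ** 2 / 2):.4f})")
```

In a real lattice calculation `lam` effectively grows with the volume or density, which is what makes the problem intractable rather than merely noisy.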
If you're interested in learning more about QC, I would highly recommend looking at Nielsen and Chuang's "Quantum Computation and Quantum Information"; it's essentially the standard primer on the world of quantum computation.
The Nielsen/Chuang book is what I see recommended everywhere, so I'm definitely going to get it. What others would you recommend?
I recently asked a similar question about books on "Modern Physics" (essentially quantum physics + relativity) here: https://hackernews.hn/item?id=46473352 Given your profile, what would your recommendations be?
PS: You might want to add your website url to your HN profile since your Physics Notes might be helpful to a lot of other folks too. :-)
Most of what I use day to day in research is either specialized to my subfield or can be found in Nielsen and Chuang, so I've actually never looked at any other textbooks specifically for quantum computation. If you're interested in more of the information theory aspect, I have heard that "The Theory of Quantum Information" by John Watrous is a good text, but I have not personally read any of it.
As for Modern Physics, if you have the math prerequisites and you want a broad overview, the series of textbooks by Landau and Lifshitz would be my go-to. However, the problems are quite challenging and the text is relatively terse. I think the only other textbook that I've used personally would be Halliday, Resnick, and Krane. I didn't read a great deal of the textbook, but I do recall finding it relatively well-written.
Simulation of quantum systems: quantum chemistry, nuclear physics, high-energy physics, condensed-matter physics, to name the most promising ones off the top of my head.
“Perhaps I could best describe my experience of doing mathematics in terms of entering a dark mansion. One goes into the first room, and it’s dark, completely dark. One stumbles around bumping into the furniture, and gradually, you learn where each piece of furniture is, and finally, after six months or so, you find the light switch. You turn it on, and suddenly, it’s all illuminated. You can see exactly where you were.” - Andrew Wiles
The hardware very much lags behind the algorithmic advances; much of the current push for new features in quantum hardware (mid-circuit measurement/feedforward, phonon-mode coupling, etc.) comes from theorist colleagues pestering experimentalists about whether their hardware can run their algorithms yet.
In fact, this is analogous to the original motivation for the development of classical supercomputers: physicists wanted to run expensive non-perturbative Lattice QCD calculations, so they co-designed some of the earliest supercomputer architectures.
I think the more convincing argument is that most known applications of quantum computers (sidestepping any hardware practicalities) are for niche problems (in my wheelhouse, quantum simulation); the average person has no practically advantageous reason to own a quantum computer.
I suspect that once quantum computers actually scale up so that you can play with them, we'll find all sorts of interesting things to do with them.
However, even now, you can imagine that if quantum computers were small enough, it would be worth having one just for asymptotically fast integer factoring with Shor's algorithm. I don't think that's too far-fetched. Of course, people wouldn't necessarily need to know they have a quantum computer, but they don't necessarily know the workings of their computers today anyway.
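For context on where the quantum speedup in Shor's algorithm actually lives: everything except the period finding is classical. A sketch of that classical half (with the order found by brute force here, so no speedup, just the structure):

```python
from math import gcd

# Classical half of Shor's algorithm: given the order r of a mod N
# (which the quantum subroutine finds efficiently), recover factors
# from gcd(a^(r/2) +/- 1, N). Brute-forcing the order, as done below,
# is exponential -- that step is exactly what the quantum part speeds up.

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), by brute force."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)     # lucky guess: a already shares a factor with N
    r = order(a, N)
    if r % 2 == 1:
        return None          # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root: retry with another a
    return gcd(y - 1, N)

print(shor_classical(15, 7))  # order of 7 mod 15 is 4 -> factor 3
```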
Reading their paper, this method does seem significantly simpler than using something like MPS. My main concern is the practical coupling regime in which the method works; I would imagine it fails closer to critical points in theories with phase transitions?
This is a nice practical technique for open quantum systems with relatively low entanglement. The introduction lays out exactly what regime they're aiming at:
1. Affordable (laptop scale)
2. Captures "sufficient" quantum effects (low-entanglement regime; you can't accurately simulate a quantum computer with this)
3. Straightforward to implement.
From a cursory glance, it does all three. I'm slightly surprised that TWA hasn't been applied extensively to open systems before, but it was always a relatively obscure technique. I'm guessing this should be quite useful in practice, e.g. for AMO and cavity systems with relatively large dissipation terms that prevent entanglement build-up. However, I'd guess it wouldn't do very well near phase transitions. All in all, a nice new technique for a regime that didn't have too many options.
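For readers unfamiliar with TWA, here is a minimal sketch of the general idea applied to an open system (my own toy version, not the paper's scheme): sample the initial Wigner distribution, evolve each sample along the classical equation of motion plus vacuum noise from the dissipation, and average. For a linearly damped cavity mode this happens to be exact, which makes it a good sanity check.

```python
import numpy as np

# Open-system TWA sketch for a damped cavity mode (linear, so TWA is
# exact here): <n>(t) = |alpha0|^2 * exp(-kappa * t).

rng = np.random.default_rng(1)
ntraj, kappa, dt, steps = 20_000, 1.0, 1e-3, 1000
alpha0 = 2.0

# Wigner function of a coherent state: Gaussian with <|delta alpha|^2> = 1/2.
alpha = alpha0 + (rng.standard_normal(ntraj)
                  + 1j * rng.standard_normal(ntraj)) / 2

for _ in range(steps):
    noise = rng.standard_normal(ntraj) + 1j * rng.standard_normal(ntraj)
    # Euler-Maruyama: deterministic decay plus vacuum noise scaled so the
    # steady-state Wigner variance stays at 1/2.
    alpha += -0.5 * kappa * alpha * dt + np.sqrt(kappa * dt / 4) * noise

# Symmetric (Weyl) ordering: <a^dag a> = <|alpha|^2>_W - 1/2.
n_avg = np.mean(np.abs(alpha) ** 2) - 0.5
print(n_avg, alpha0 ** 2 * np.exp(-kappa * steps * dt))
```

The "cheap" part is that cost scales with the number of trajectories and modes, not with Hilbert-space dimension; the "low entanglement" caveat is that the classical trajectories simply cannot represent strongly entangled states.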
I think you will have best luck by searching for "open quantum systems" toolboxes in your language of choice. My preferences are, in order:
- QuantumOptics.jl in Julia
- QuantumToolbox.jl in Julia
- qutip in python
These are all "just" nice domain-specific wrappers around linear algebra and differential equation tools. They do the "silly" exponentially expensive simulation technique that works for any quantum system. If you are interested in efficient (non-exponential) simulation techniques that support only a subset of all quantum dynamics, try out:
- stabilizer formalism (e.g. for error correction) with QuantumClifford.jl or stim
- Gaussian quantum optics (e.g. for laser physics) with Gabs.jl
- tensor networks (e.g. for arbitrary low-rank entanglement) with ITensors.jl
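To show what the "silly" exponentially expensive technique those first toolboxes wrap actually is, here is a bare-hands sketch in NumPy/SciPy: vectorize the Lindblad master equation, build the Liouvillian as a d² x d² matrix, and exponentiate it. The example (a qubit decaying from its excited state) is my own choice; any small open system works the same way.

```python
import numpy as np
from scipy.linalg import expm

def liouvillian(H, L):
    """Lindblad Liouvillian under row-stacking vectorization,
    where vec(A X B) = (A kron B^T) vec(X)."""
    d = H.shape[0]
    I = np.eye(d)
    LdL = L.conj().T @ L
    return (-1j * (np.kron(H, I) - np.kron(I, H.T))   # -i[H, rho]
            + np.kron(L, L.conj())                     # L rho L^dag
            - 0.5 * (np.kron(LdL, I) + np.kron(I, LdL.T)))  # -1/2 {L^dag L, rho}

gamma = 1.0
H = np.zeros((2, 2))                          # no coherent dynamics
sm = np.array([[0, 0], [1, 0]], complex)      # lowering op: |e>=(1,0) -> |g>=(0,1)
L = np.sqrt(gamma) * sm

rho0 = np.array([[1, 0], [0, 0]], complex)    # start in |e><e|
t = 1.0
rho_t = (expm(liouvillian(H, L) * t) @ rho0.reshape(-1)).reshape(2, 2)
print(rho_t[0, 0].real)                       # excited population = exp(-gamma*t)
```

The d² x d² Liouvillian is why this is "exponentially expensive": for n qubits, d = 2^n, so the matrix has 16^n entries. The toolboxes above hide this bookkeeping (and use sparse matrices and ODE solvers instead of a dense `expm`), but the underlying object is the same.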
Practically speaking, QM can be taught without assuming that students understand the Hamiltonian formalism, simply by starting with Hilbert spaces and operators on them. In fact, I would claim that a class on basic linear algebra prepares you to understand quantum mechanics better than mastering classical mechanics does. QM is generally taught by referencing classical mechanics, but I believe that's more a reflection of the fact that most universities require classical mechanics as a core course, so students coming into QM will generally have taken it.
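The linear-algebra-first view really is this small in practice: a qubit state is a unit vector in C², observables are Hermitian matrices, and predictions are inner products, with no Lagrangians or Hamiltonian mechanics in sight. A sketch:

```python
import numpy as np

# QM from pure linear algebra: states are unit vectors, observables are
# Hermitian matrices, predictions are inner products.

sz = np.array([[1, 0], [0, -1]])       # Pauli-Z observable
sx = np.array([[0, 1], [1, 0]])        # Pauli-X observable

plus = np.array([1, 1]) / np.sqrt(2)   # the |+> state

# Expectation values <psi|O|psi>:
print(plus.conj() @ sz @ plus)   # ~0.0: Z is maximally uncertain in |+>
print(plus.conj() @ sx @ plus)   # ~1.0: |+> is an eigenvector of X

# Born rule: probability of measuring Z = +1 is |<0|psi>|^2.
p_up = abs(np.array([1, 0]).conj() @ plus) ** 2
print(p_up)                      # ~0.5
```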