A National Initiative to Build a Quantum Computer

In Vienna, Virginia on April 23-25 a workshop is being held in response to a report, "A Federal Vision for Quantum Information Science," issued by the United States National Science and Technology Council. While this workshop looks, from the outside, like any other typical quantum computing workshop, this is a bit deceptive: from what I understand, this workshop is supposed to provide the impetus for a report arguing for major spending on quantum information science in the United States, especially by the National Science Foundation. The Quantum Pontiff, unfortunately, is stuck unquantumly pontificating before his intro to computer science theory students, so he won't be able to attend the workshop. Which is all to say that this is as good a place as any to write down my own thoughts on what a national initiative in quantum computing should look like. (Of course my qualifications to make such a judgment are thin at best, being a second-rate pseudo professor from the nether regions of quantum computing. But ain't blogs great. On the internet no one knows you're a research assistant professor!)

My own bias as to what any national initiative should be is already quite evident in the last paragraph, and well...it's blasted into the title of this post. I said a national initiative in quantum computing, not a national initiative in quantum information science!

Let's begin where the impetus for the workshop starts: with the report on a federal vision for quantum information science. First of all let me say how nice it is to read a document that says "federal" on its title page but that presents a very nice description of quantum information science. The authors should be commended for somehow avoiding the federal bureaucracy fudging-it-up filter (maybe it was even more technical and precise before going through this filter?! And by the way, who wrote the document? No names are given for the mysterious subcommittee that produced it.)

The summary of the report is provided by the document itself:

The scope of the scientific challenge that must be addressed if we are to fully exploit the potential possibilities that QIS provides for 21st century technology is encompassed in the following three fundamental questions.
  • What is the true power of a general purpose quantum computer, what problems does it allow us to compute efficiently, and what does it teach us about nature?
  • Are there fundamental limits to our ability to control and manipulate quantum systems, and what constraints do they place on technology and QIS?
  • Are there exotic new states of matter that emerge from collective quantum systems, what are they useful for, how robust are they to environmental interactions, and do these collective quantum phenomenon limit the complexity of the quantum computing devices we can build?

This is a great summary of three major intellectual challenges arising from quantum information science. They are the questions that get you up out of bed in the morning wondering what new and interesting facts you will discover about how the world works. These are the bread and butter of current academic research into quantum information science, both in theory and experiment, and deserve to be funded at a high level.

But let's take a step back for a second and consider not what is included in that document, but what is missing. Do a search for the word "build" in the document. Do a search for the phrase "quantum computer" and look at the surrounding sentences. Notice something funny? Missing from the document is any mention of the desire to actually build a quantum computer!

Now, I'm going to be the first to say that fundamental research into the broad field known as quantum information science should be supported. I will argue to my deathbed that it's exactly the kind of fundamental research that could pan out big, not necessarily today, but in the future, that reveals much about our universe, and that challenges our current understanding of the fundamental limits of computing in the universe. It's exactly the kind of research that the NSF should fund at a level higher than it does today (but thankyouverymuch NSF for your support of my research :) ), and that agencies such as IARPA, DARPA, ARO, and the NSA have been funding very strongly since Shor's discovery of a quantum algorithm for factoring in 1994. But, as Dorit Aharonov so wisely said to me while at QIP last year (paraphrasing), "Quantum computing isn't interesting without a quantum computer!" Which isn't to say that the kind of research described in the report shouldn't be funded: indeed Dorit herself is a fine example of what you'll get for such funding, a bunch of awesome results challenging what we understand about information processing in the quantum world. And for researchers in quantum information science, there is value in their research, because there is the prospect that it will be much more than just exercises in their heads, that someday their ideas will find their way to actual hardware exploiting quantum effects. But, really, at some point you have to stop and ask yourself (as happened to me when Dorit chewed me out :) ): where is the major push to build a quantum computer?

There is a tendency in quantum computing to not want to promise a quantum computer. Despite what you might read on EurekAlert! or in New Scientist, most quantum computing researchers are very conscientious about avoiding hype (resistance to hype being strongly correlated with distance from tenure decisions.) And this caution has, for a long while, been very much justified. In the year 2000, for example, there were not many physical implementations of quantum computers that had passed the proof-of-principle barrier (ion traps being the only one in the ballpark.) But results since that time have been tremendous. Which brings us to the question: is it still true that quantum computers are "a decade away?" And, even if quantum computers are still far off, is progress being helped or hindered by the current way in which quantum computing is funded?

Recently, in thinking about what I wanted to do with my life, I sat down and did some deeper reading into the experimental progress in quantum computing over the last few years. When I graduated from Berkeley in 2001, I made a very conscious decision to focus my own research on the computer science side of quantum computing. I tried to work on quantum algorithms because I thought not enough people were working on quantum algorithms (and if I could make any small progress then maybe people much smarter than me, that is to say everyone, would say, "hey, if Dave Bacon can work on quantum algorithms then so can I!") I worked on ideas for self-correction because I was deeply skeptical of approaches to building a quantum computer that don't take error correction as a fundamental objective of a physical implementation. To this day, if there is one thing I could say to every experimentalist working in quantum computing it would be "deeply absorb how quantum error correction works!" And I most specifically did not focus on work that was closely tied to one particular physical implementation of quantum computers. This latter fact was, in large part, because there were not any implementations I could latch onto as having the potential to be the one which would scale, in the coming decade, into a large usable quantum computer. But in reviewing the progress over the last five years or so, I don't believe I can as easily conclude this.

Now I'm but a mere theorist, probably only experimentally capable of reproducing Feynman's sprinkler à la a Feynman explosion, so my judgment about the viability of different potential implementations of quantum computing is probably highly suspect. But from my perspective, both ion traps and, increasingly, superconducting qubits seem to me to be close to the stage where the questions no longer focus on one- or two-qubit devices, but instead, now or within a few short years, on how to construct quantum computers with hundreds to thousands of qubits. I would also say that one benefit of hanging out in a Computer Science and Engineering department is that one begins to actually appreciate the power and difficulty of engineering, and how the ultimate obstacles to building a quantum computer will be explicitly engineering ones. Of course all of this is for naught if the basic one- and two-qubit gates and the technologies for building larger numbers of qubits are not in place. Ion traps are certainly well within the regime where one can consider fault-tolerant quantum computing, the main challenge being the creation of viable scaling technologies for traps, a topic which has met with recent great success (see the Wineland group's heating-rate results and corner-turning traps.) Superconducting qubits of various forms have now implemented the basic gates, achieved impressive single-qubit operations, and shown amazing progress with methods for coupling these qubits to microwave cavities.

All of this, to me, then, raises the question: is it time for a large-scale push to take one or two of these implementations and attempt to scale them up? Which brings me to the title of my blog post: a big science project to build a quantum computer.

Inevitably when one thinks of projects like this, the words which immediately come up are "the Manhattan Project" (this connection is drilled into the brain of every physics graduate student: the bomb equals funding for physics.) But, actually, there are tons of other examples to draw from of projects that were engineering tours de force, in which a community of scientists and engineers undertook a task on the scale of a major attempt to build a large quantum computer. I think, of course, of large chunks of the space program, the Apollo project, the Voyager missions, etc., and also of numerous large telescopes built over the years, space-based missions for astronomy, and the LIGO project. Then of course there are high-energy projects like the current multi-billion dollar Large Hadron Collider and the ITER fusion project. Each of these projects is deserving in its own right: many challenge our sense of exploration, our quest for fundamental knowledge, and even the future of energy consumption on our planet. And, because I am fully converted, I will argue vociferously that the building of a large quantum computer can compete with any of these projects in terms of public good.

To quote from a policy piece I co-authored with Scott Aaronson:

...quantum computers will...revolutionize large parts of science... Simulating large quantum systems, something a quantum computer can easily do, is not practically possible on a traditional computer. From detailed simulations of biological molecules which will advance the health sciences, to aiding research into novel materials for harvesting electricity from light, a quantum computer will likely be an essential tool for future progress in chemistry, physics, and engineering.

Now I'm enough of a theorist to always worry about the promise of algorithms on quantum computers, but not enough of a skeptic to doubt the original insight into why we should build a quantum computer: that it can be used to simulate quantum systems (and by simulate I do not mean "measure the energy levels," hrmph) in regimes where our modern computers will not succeed. Just the other day, for example, I was learning about how certain microscopy/spectroscopy techniques are limited by the computing power needed to simulate the underlying quantum systems and thus make sense of the data extracted from them. I strongly believe that a quantum computer will be an essential tool for understanding large chunks of chemistry, biology, and materials science.

Thus I believe that (1) the justification is there, and (2) increasingly the science is there. I also believe that a large initiative will be needed. This is mostly because the fabrication and engineering requirements for overcoming the challenges of building a large-scale quantum computer are, I believe, not the kind of challenges best suited to a typical academic setting. Say what you will about D-Wave, for example (well, as the only example!), but at least they realize that the kind of fab and engineering needed to attempt building something like a quantum computer is not the kind you'll find within the realms of most of academia.

Now, in the large sense, the idea of a national initiative to build a quantum computer is fighting against history. Historically, quantum computing has been a romper room for theorists, and a place where physicists could push the limits of quantum control and our understanding of the physics of quantum coherent systems. There is an inertia here caused by a spreading of money to many different groups attempting many different physical implementations of quantum computers. But what happens when the road to building a large quantum computer becomes more apparent? I believe the path forward should be to take the implementations that show this promise and drive forward. I believe that a quantum computer is a machine we should be shooting for, not some happenstance which helps me get funding. This is not to say that I do not support fundamental research in quantum information science: I think exactly the opposite. But I'm for having my cake and eating it too: adding a major initiative to actually build a quantum computer to NSF-funded support of the broad area of quantum information science.

Of course my opinion on this whole thing is worth snot if it isn't at least shared by other scientists and engineers, by the funding agencies, and by the politicians who would be at the heart of funding such a large initiative. In dark alleys at quantum information science meetings I sometimes meet those who really truly desire to build a quantum computer. We give a secret handshake, pay attention to the talks with the most promising technologies, and debate what the word "scalable" actually means. I, for one, however, am coming out of the dark: I believe it is time to shoot for the stars and build a large quantum computer.

There. Hopefully that will at least provide some minor fodder for the workshop. Who knows, maybe even a few other members of the secret order of "those who want to build a quantum computer" will be drawn out into the light.


So since your research focus is on error correction, let me ask: what is the current overhead required for error correction? A few years ago the overhead (think of it as the number of qubits needed to error-correct a single qubit) was very large.

As an even crazier theorist than you, I nonetheless wholeheartedly concur. As Carl Caves once told me (approximately), we're supposed to be interested in the real world. It's what separates us from the philosophers. I love the deep questions, but the engineer in me also believes that something tangible needs to come out of what we do. It's one of the beefs I have with string theory. I use 'tangible' liberally, since I believe that astronomy is a very important science; I believe observational data from quasars and things of that nature qualify as tangible, for example. But we have to do more than simply find and then solve ever more difficult mental puzzles. At some point we have to demonstrate a broader benefit to humanity, particularly if we plan to use public money in our research.

Dave, please let me pass along my highest recommendation for NASA History Division's seven-volume series Exploring the Unknown: Selected Documents in the History of the U.S. Civil Space Program, especially the essay that introduces Vol. 2, Ch. 2: The NASA-Industry-University Nexus: A Critical Alliance in the Development of Space Exploration by W. Henry Lambright.

Needless to say, pretty much every person who is seriously planning an Apollo-class national QIS initiative is studying these extraordinarily interesting documents very carefully. :)

These NASA documents are valuable enough to illuminate how very regrettable it is that von Neumann's 1954 SMEC report (which catalyzed the NASA effort) has never been declassified. Still, by "reading between the lines" of NASA's civilian documents, the main technical and strategic themes of von Neumann's SMEC report can be discerned with reasonable clarity.

Drawing upon these lessons from history, it is the case (IMHO) that a mathematically sound, scientifically justified, and strategically important QIS-centric SMEC document could be written today, with the acronym "SMEC" nowadays taken to stand for "Spectroscopy, Microscopy, Enterprise, and Computation" rather than "Strategic Missile Evaluation Committee."

The scale of the consequent QIS initiative would likely be quite different from Apollo's 430,000 at-peak FTE employment (80% industrial, 12% university, 8% federal). Different smaller ... or different larger ... aye laddie, that's the QIS question!

Do you think the group at NIST is then the closest thing to a Manhattan Project for QC-building? Or are they still too academic? They do seem to have a lot of people, a lot of engineering expertise and fab facilities, and decades of ion-trapping experience.

p.s. count me in for this secret order :-)

Right now there is only one effort in the world that could be considered a Manhattan Project for QC, and that's D-Wave. They have the world's best superconducting circuit design team, the world's best superconducting fab, a gazillion experimental rigs, and the only -- yes, that's right, the only -- existing proposal for a chip design that actually makes sense. NIST? You've got to be kidding, right?

I think quantum computers are still at least a decade away. I might be wrong, but there is a risk in asking for a large push to build quantum computers. It could well fail, not for any fundamental reasons but because it was started prematurely, before the necessary building blocks were in place. This would be very damaging to the field.

I agree that the federal report is conservative. But the quantum algorithmists have not been successful enough; your scheme has not worked. Quantum computers aren't interesting without more quantum algorithms. (Okay, too strong a statement, but I couldn't resist the parallelism.)

bobh: overheads depend on fidelity of your basic operations, architecture constraints, etc. There has been progress in getting these smaller, but they will never be zero unless we find some magic qubit.

Wilf: I might agree that D-wave is the largest effort to build an adiabatic algorithm quantum computer, but not a quantum computer.

sw: The group at NIST is clearly the largest ion trap effort in the US...

Jon: "I think quantum computers are still at least a decade away." This does not preclude getting an initiative started NOW. If you look at many of the large projects throughout history, very few did what they said they would immediately. Large projects take intensive planning, lining up all the necessary politics, and then some very hard engineering.

"This would be very damaging to the field." What is dangerous is spending all of our effort on the periphery of building a quantum computer. A field that is only ideas and not connect to the real world (as Ian emphasizes) is one that will end up in trouble (I think we have witnessed a severe downturn in that field exactly for this reason...note that this is actually independent of whether the field has potential or not.)

"I agree that the federal report is conservative. But the quantum algorithmists have not been successful enough; your scheme has not worked. Quantum computers aren't interesting without more quantum algorithms. (Okay, too strong a statement, but I couldn't resist the parallelism.)"

Quantum algorithmists (all six of us) have not been hugely successful, indeed. But as I tried to point out, I do not believe the CSee algorithms will be the main focus of quantum computers. To me the main focus will be "anything you can do I can do better." What I mean by this is that there is a general principle that says that since physical systems can carry out task X, then a computer, which is a physical system, must be able to carry out task X. The important thing that quantum computers point out is that when task X in the real world involves quantum effects, then this equivalence breaks down in terms of efficiency. This is the real potential of quantum computers. (Depending on the architecture chosen and future progress in classical computers, I also think the polynomial speedups will be important and are downplayed, especially by theorists, because they are not as impressive as factoring.)

On a basically related note, I just got an email from my HS friend Chris [whose arXiv history I linked in a comment a while ago]. He's at Chalmers doing superconducting qubit stuff [and I think he is on the build side], and just got a grant from IARPA.

Hi Dave,

I think your point about where you think the value of these systems lies is a very important one.

It is considerably easier to build hardware designed to run one class of quantum algorithms than all possible quantum algorithms. Said another way, the risks in succeeding in building what you like to refer to as a quantum computer (meaning a gate model quantum computer capable of running all possible quantum algorithms) are much, much higher than the risks associated with building a special purpose system designed to run a single class of quantum algorithms.

If you think that the real value lies in some aspect of quantum simulation then you would be much better off designing something specifically for that purpose.

This line of thought eventually will lead you to the place D-Wave is at now. We basically had to choose between building something for doing a particularly valuable aspect of quantum simulation (calculating the ground state energy of a molecule as a function of its nuclear positions, i.e. the definition of chemistry à la Jensen et al., using a gate model approach) and building something for discrete optimization (using the adiabatic approach). Both have significant commercial value. What the decision boils down to is this:
  • Package A: unknown performance of adiabatic quantum optimization algorithms + straightforward hardware with few uncertainties + big commercial opportunity.
  • Package B: certainty of excellent algorithmic performance + highly risky and uncertain hardware with no existing solutions to key requirements + big commercial opportunity.

We chose Package A. It was absolutely the right choice.

There is nothing wrong with starting a project to build a gate model system; hell, I love big ambitious projects. But the objectives and planning of such a project have to make sense. You need support infrastructure in place that no one other than us has been able to assemble.

I think we are far enough along with what we're doing that gate model systems built using our designers and fab will almost certainly come along WAY before the output of any government funded effort.

Great report, Dave. But I would really urge you to stop belittling yourself so much. You're certainly one of the smartest people that I know, one of the few with a clear long-term vision, one of the few who try to solve the really hard questions, and I'm sure you realize this.

I second Frank's comment. You do yourself a disservice, Dave. Your title is irrelevant.

I was also going to say that, whatever you may think of D-Wave and their various claims, at least they're making an attempt at producing something tangible (and, Geordie, this is *not* a blatant suck-up attempt! :)).

Geordie, you definitely are correct that "calculating the ground state energy of a molecule as a function of its nuclear positions" is a very important problem.

But does solving this class of problems really require a quantum computer? Like many people, I was very impressed when a team led by Marcus Neumann, Frank Leusen, and John Kendrick won this year's CCDC Crystal Structure Prediction Contest with a perfect score (BibTeX attached).

Their success required predicting chemical ground state energies accurate to about one part in 10^5 (absolute), for systems of hundreds of interacting electrons.

Simplistic QIS arguments (based on concentration of measure theorems, for example) would suggest that this performance is infeasible. So these simplistic QIS arguments are misleading us, in some fashion that is poorly understood at present ... unless perhaps some Pontiff-reader would like to try their hand at explaining it?

IMHO, one of the lessons is that the QIS community has much to learn about the real-world computational complexity of ab initio chemistry ... which is a wonderful opportunity for QIS!

-------

@article{Neumann:2008aa, Author = {M. A. Neumann and F. J. J. Leusen and J. Kendrick}, Journal = {Angewandte Chemie International Edition}, Number = {13}, Pages = {2427--2430}, Title = {A major advance in crystal structure prediction}, Volume = {47}, Year = {2008}}

@article{Sanderson:2007gf, Author = {Sanderson, Katharine}, Journal = {Nature}, Number = {7171}, Pages = {771}, Title = {Model predicts structure of crystals}, Volume = {450}, Year = {2007}}

"A field that is only ideas and not connect to the real world (as Ian emphasizes) is one that will end up in trouble (I think we have witnessed a severe downturn in that field exactly for this reason..."

I am not sure what you mean, what downturn? Experimental quantum computing research has never been better. For theorists, perhaps there has been a downturn. This may be justifiable, though.

"I do not believe the CSee algorithms will be the main focus of quantum computers." This may be true, but it is also a cop-out answer. If you were an experimentalist, I would be satisfied. But it seems fishy to me to see theorists arguing for a huge experimental push, in order to prevent a downturn in an unproductive and disconnected theory community. I would like to hear more from the experimentalists.

(Please don't take offense, I am trying to play devil's advocate here.)

Jon: oops...cut and paste killed me... "(I think we have witnessed a severe downturn in that field exactly for this reason...note that this is actually independent of whether the field has potential or not" should have been "(I think we have witnessed a severe downturn in string theory exactly for this reason...note that this is actually independent of whether the field has potential or not"

Devil's advocates are always welcome.

I don't quite get your devilish advocacy here, though. Why would an argument from experimentalists be more important (especially when it comes to the actual potential of quantum computers to be useful)?

My title is irrelevant (except for keeping bread on the table, so to speak), but I am a very very slow theorist, trust me. Which is why everyone should work on quantum algorithms :)

John: It might not require one, but it might be much faster with one. In these sorts of calculations, the energy as a function of (classical) nuclear coordinates is often a subroutine of some other calculation that must be called many times. For example, you might be looking to find the lowest ground state energy as a function of nuclear coordinates (the structure prediction problem). Even if you could find a smart classical way to do this, I would wager that a special purpose chip running phase estimation etc. would be MUCH faster and allow you to do things in practice you couldn't because of time constraints. Even if you got no algorithmic advantage at all (unlikely), a significant pre-factor speed-up would translate to a viable business.
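(To make the call pattern concrete, here is a toy sketch; energy() below is a made-up stand-in for the real ground-state-energy evaluation, whether classical or quantum, so only the call structure matters:)

    # Toy sketch of the structure-prediction outer loop described above.
    # energy() is a hypothetical placeholder for "ground state energy as a
    # function of nuclear positions"; a real version might be classical ab
    # initio code or a quantum phase-estimation chip.
    def energy(geometry):
        # made-up energy surface, minimized near coordinates (1.0, 1.0)
        return sum((x - 1.0) ** 2 for x in geometry)

    def structure_search(candidates):
        # outer loop: one energy call per candidate geometry, so any
        # speed-up of energy() multiplies through the entire search
        return min(candidates, key=energy)

    print(structure_search([(0.0, 0.0), (1.0, 1.1), (2.0, 2.0)]))  # -> (1.0, 1.1)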

"bobh: overheads depend on fidelity of your basic operations, architecture constraints, etc. There has been progress in getting these smaller, but they will never be zero unless we find some magic qubit."

Of course I understand all of that. I was asking for an order of magnitude based on the state of the art for QC - factor of 10...100?

Geordie says: a significant [quantum simulation] speed-up would translate to a viable business

IMHO you are absolutely right ... and this is why progress in QIS has great strategic value on *both* fronts: better binary algorithms on classical hardware, and better qubit algorithms on quantum hardware.

We have solid mathematical and technical grounds for optimism (IMHO) that coming decades will witness sustained QIS-driven progress on both fronts. :)

Depends on what you want to do, where you are at with your fidelities, and what assumptions you make about your spatial architecture. But given these, to get into the regime of factoring, the number is in the hundreds of physical qubits per logical qubit.
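(For a rough sense of scale, a back-of-the-envelope sketch; every number below is an illustrative assumption, not a measured requirement:)

    # Back-of-the-envelope physical qubit count for factoring, given an
    # overhead of "hundreds" of physical qubits per logical qubit. All
    # numbers are assumptions for illustration only.
    bits = 1024                # size of the number to factor (assumed)
    logical_qubits = 2 * bits  # rough logical register size for a Shor-type circuit
    overhead = 500             # physical qubits per logical qubit (assumed)
    print(f"~{logical_qubits * overhead:,} physical qubits")  # ~1,024,000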

Dave, I agree with you but wanted to hear what you thought of Sandia's efforts. They're making a large push in Si dots and ion trap scaling, and have the fab and engineering capabilities not commonly found in academia. Plus they have top young scientists from the best groups and strong cred with the broader academic community. Could Sandia be the place to try building something useful?

You wrote 'fidelities', 'assumptions', 'architecture', and 'regime' all in the same comment. Have you been writing proposals lately?


"Depending on the architecture chosen and future progress in classical computers, I also think the polynomial speedups will be important and are downplayed, especially by theorists, because they are not as impressive as factoring."

Do you know of any papers that make this case seriously? Such a paper would need to identify areas where memory requirements are limited, I/O is extremely limited, and yet polynomial speedups (at potentially much slower clock speeds, because of different architectures and overhead) would still be important. It seems like a niche to me, but perhaps it is an important one.

A specific answer to Jon's question is supplied by the 2006 PCCP review article Advances in methods and algorithms in a modern quantum chemistry program package (DOI: 10.1039/b517914a). See, in particular, this article's Section 1.2 Challenges, sub-heading Efficient Algorithms.

Where are the references to the QIS/QIT literature? There are none given.

And yet, when this article is read with reflective consideration, every page is seen to be rich in implicit references to fundamental principles of QIS/QIT. And this is a tremendous opportunity for the QIS/QIT community.

MikeB: Sandia certainly has some big advantages, and I really hope they continue their QC push.

But one point, I think, is that currently Si double quantum dots are not past the proof of principle stage. This doesn't mean they won't be able to make it (esp. with the group they've assembled), but it's harder to advocate a major scaling up without at least one- and two-qubit gates demonstrated (so, for instance, a few years ago I wouldn't have included superconducting qubits as potential candidates for a major effort.) That said, they are making good progress (which we saw at SQuInT this year) and definitely the fabrication facilities at Sandia are fabulous.

You wrote 'fidelities', 'assumptions', 'architecture', and 'regime' all in the same comment. Have you been writing proposals lately?

Ha, sadly no. My proposals have included words like 'anyons', 'gadgets', and 'stabilizer' :)

Jon: I know of no papers that make this case on an architectural basis. It does lead one to favor physical implementations with fast intrinsic gate speeds (I personally wouldn't advocate ion traps here). Also, the kinds of problems where you will get the polynomial speedups are those which are of a combinatorial nature, so anytime it is a data-intensive problem you probably aren't in the correct regime. I think D-Wave is the place to ask about what sort of market this sort of problem has.
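(A quick sketch of why intrinsic gate speed matters so much for polynomial speedups; both clock rates below are hypothetical:)

    # Crossover estimate for a quadratic (Grover-type) speedup: a classical
    # machine doing N steps at rate f_c versus a quantum machine doing
    # sqrt(N) steps at a much slower rate f_q. Both rates are made up.
    f_c = 1e9  # classical steps per second (assumed)
    f_q = 1e5  # quantum logical gate steps per second (assumed)

    # Quantum wins when sqrt(N)/f_q < N/f_c, i.e. when N > (f_c/f_q)**2.
    crossover = (f_c / f_q) ** 2
    print(f"quadratic speedup pays off only for N > {crossover:.0e}")  # 1e+08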

Dave, you are right ... and this lengthy author list illuminates the vast scope of employment opportunities in QIS/QIT ... heck, the list of Q-Chem developers alone will likely be comparable to the total number of attendees at the Federal Vision for QIS workshop.

This provides plenty of interesting starting points for further reflective consideration. For example, the number of authors is also comparable to the number of Nobel Prize winners for research in quantum chemistry and magnetic resonance (two disciplines in which quantum simulation has historically played a major role). :)

Dave says: "the kinds of problems where you will get the polynomial speedups are those which are of a combinatorial nature"

Dave, I agree with what you say 100%. And may I say, the minor adjustment of "algebraic" in place of "combinatorial" makes your point even more strongly.

An illuminating example is that the Slater determinants that are cherished by quantum chemists are isomorphic to the antisymmetrized polynomials that are cherished by algebraists, which in turn are isomorphic to the class of Riemannian/Kählerian manifolds known as "Grassmannians" that is cherished by geometers.

This broad-ranging isomorphism undoubtedly has a quantum informatic aspect too. Elucidating this informatic aspect is (IMHO) a natural objective for the discipline of QIS/QIT (as broadly conceived).

What we are talking about, essentially, is the natural role of QIS/QIT in extending the unifying notion of "algebraic geometry" to the larger (and IMHO more naturally unified) notion of "informatic algebraic geometry" ... and then applying this larger unification in service of practical problems.

"Depends on what you want to do and where you are at with your fidelities and what assumptions you make about your spatial architecture. But given these, to get in the regime of factoring the number is the hundrends of qubits per logical qubit."

Thank you

"I think we are far enough along with what we're doing that gate model systems build using our designers and fab will almost certainly come along WAY before the output of any government funded effort."

Geordie: forgive this possibly naive question, but since you already have 128 qubits and all the requisite control hardware and software, wouldn't it be easy enough to demonstrate, say, a 3-qubit Toffoli gate and silence the critics?

sw: If we were to build gate model systems the primary objective wouldn't be to demonstrate gates. It would be to run an algorithm. Probably a quantum simulation algorithm. Probably the one I referred to above. The reason I make this distinction is that once you set your primary objective all of the performance specifications of the sub-components (qubits, operations on qubits, couplers, etc) get set by this. Setting an objective (like demonstrating some set of gates) without reference to hardware details etc is not a good idea. I am not a fan of bottom-up design approaches. All specs and intermediate objectives need to be driven by an ultimate objective and conditioned by the strengths and weaknesses of the chips & systems surrounding them.

In my opinion one of the big reasons why there really are only a handful of serious efforts to build quantum computers (arguably only one) is that people are stuck in the bottom-up design viewpoint and it causes paralysis. It is not a good idea to try to architect what you would like to have. Nature doesn't generally comply with a theoretical computer science view of the way things should be. People can get to work building things only when they have a clear vision of the end result, that end result has clear value, and the path to get there is based on the way systems actually behave in the lab, not the way you want them to behave.

Setting an objective (like demonstrating some set of gates) without reference to hardware details etc is not a good idea. I am not a fan of bottom-up design approaches. All specs and intermediate objectives need to be driven by an ultimate objective and conditioned by the strengths and weaknesses of the chips & systems surrounding them.

This is a good point and one of the reasons answering Jon's question above isn't easy. In my opinion a large part of this comes from the religion of the threshold theorem: the view that if you just get below the threshold for fault-tolerant quantum computing, then the clouds will part, a quantum error correcting theorist will come down from the sky, and you will be able to break RSA. I guess the point is that there is merit in proof of principle work on physical systems, but once you start to show progress on that front the rules to the game shift.
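(The threshold theorem's promise, in toy form; the numbers below are assumptions chosen only to exhibit the scaling:)

    # Toy concatenated-code scaling behind the threshold theorem: below
    # threshold, the logical error rate falls doubly exponentially in the
    # concatenation level L, while overhead grows only exponentially.
    # p and p_th are assumed; 7 is the Steane-code block size per level.
    p, p_th = 1e-4, 1e-2
    for L in range(1, 5):
        p_logical = p_th * (p / p_th) ** (2 ** L)
        print(f"level {L}: logical error ~ {p_logical:.0e}, "
              f"~{7 ** L} physical qubits per logical qubit")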

Not only do I agree completely with Geordie's and Dave's two preceding posts, IMHO the system engineering aspects of their reasoning can (and should) be pushed further.

If we consider "the big five" of high-technology system engineering projects since 1935---namely radar, nuclear weapons, space travel, VLSI chips, and the HGP---we find that all of these enterprises relied on detailed system simulation to coordinate the effort, both technically and sociologically. Even today (for example) NASA requires that about 1/4 of the budget for each new spacecraft be devoted to system-level simulation ... and as these simulations unify the hardware into a system, they serve the equally vital purpose of unifying individuals into a community. And increasingly, system biology and its newborn sibling synthetic biology are no different! :)

From this point of view, a fundamental challenge in quantum computing is the ratio that Dave pointed out: the hundreds of physical qubits that are required to make up just one logical qubit. At present we lack the capability to realistically simulate the physical state-space of even one logical qubit ... so isn't it therefore the case that we presently lack both the system engineering capability and the community-building capability that, historically, have been at the heart of all previous large-scale technology enterprises?

This is not to say that building a quantum computer is infeasible ... it is only to point out that (if we take a lesson from history) radically new system engineering methods must be devised, for this enterprise to have a maximal chance of success, in both its technological and social aspects.

I also agree that the major goal of developing quantum information technology is to build quantum computers. Nobody can promise it, but I completely agree with the policy statement of Scott and Dave that, if built, they are likely to revolutionize large parts of science. Some people think that QC feasibility is a logical consequence of QM and therefore they can be built. I disagree; my opinion is that we simply do not know. It is possible that QCs can be built, and given the high stakes this possibility is enough reason to aggressively try to build them.

I also think that the efforts in this direction, both theoretical and empirical, are likely to lead to various other important fruits. This also includes the scenario (however unlikely you may regard it) that there are some fundamental obstacles to building computationally superior QC.

Hi Gil! Not only do I agree with your post 100%, it would be reasonable (IMHO) to make the same point even more strongly, as follows.

Every theorist is accustomed to regarding (or at least teaching) that the state-space of quantum mechanics is (1) a linear state-space having (2) exponentially large dimension, with (3) unitary dynamical equations, and (4) a gauge-type unraveling invariance.

And experiments aside, the *really* great thing is, these postulates create a mathematical paradise! Because from these postulates we can rigorously prove wonderful ergodic theorems, spectral theorems, concentration of measure theorems, channel capacity theorems, group-theoretic theorems ... all of the beautiful theorems that make QIT fun ... and quantum physics possible to teach at the undergraduate level, too.

Now, a maxim in medical training is "don't take away your patients' illusions until you can offer them better illusions" ... and similarly, a great challenge for the mathematical QIT community is to find quantum postulates that will yield mathematical theorems that are even more beautiful than those of linear Hilbert theory.

Historically, this mathematical revitalizing process has happened fairly often in physics ... for example, Galilean/Newtonian state-spaces were slowly supplanted by Riemannian/Einsteinian state-spaces, and (more recently) the point-like state-spaces of field theory are slowly being supplanted by the stringy state-spaces of ... well ... whatever M-theory may turn out to be! And mathematical physics did not become impoverished thereby ... instead, it became immensely richer.

So in the long run, perhaps we all have to be prepared for the possibility---perhaps even the likelihood---that whenever the string-theory community gets around to telling us what the state-space of QIT is ... it will turn out not to be a Hilbert space ... it will be something even better. :)

I think that GaAs double quantum dots have already passed the proof of principle stage. Single spin qubit operations (Delft group), a square root of SWAP gate in 180 ps (Harvard group), and spin qubit lifetimes of around 1 microsecond (Harvard and Delft groups) have been impressively demonstrated in experiments. I also think that silicon quantum dot based spin qubits can emerge as an appealing approach to extending the success of GaAs spin-based double quantum dot qubits. Silicon quantum dots will pass the proof of principle stage soon. Actually, a recent experiment (Nature Physics 4, 540 (2008)) has given a positive answer. Sandia National Laboratories will play an important role in this approach. I want to hear Dave's comments about this.

Tom Tu, the experiments you cite are impressive ... it seems to me that, overall, the QIT experimental programs now underway are among the most sophisticated ever attempted, in any scientific discipline.

But again, the history of science and technology teaches us very consistently how easy it is to overlook noise mechanisms and to underestimate the difficulty of achieving the reliable system-level understanding that is required for design engineering.

With regard to electron spin resonance in bulk solids, a very thorough (and very sobering) historical review can be found in Orbach and Stapleton, Electron Spin-Lattice Relaxation. No one should assume that electron relaxation in solids is fully understood, even now!

Controlled thermonuclear fusion in plasma reactors is another sobering example of a discipline in which small devices work well, but large devices work poorly, in consequence of noise mechanisms that have adverse scaling. When it comes to large-scale systems engineering, sometimes Nature's answer is "no", or "rethink your objectives and approach."

--------------------------

@incollection{Orbach:1972aa, Author = {R. Orbach and H. J. Stapleton}, Booktitle = {Electron Paramagnetic Resonance}, Editor = {S. Geschwind}, Pages = {121--216}, Publisher = {Plenum Press}, Title = {Electron Spin-Lattice Relaxation}, Year = {1972}}

Who are you kidding? Your friend Scott Aaronson publicly rips D-Wave for even thinking about thinking about building a quantum computer. Don't pretend to be nice. Some believe D-Wave or any other company should get THEIR ok first before building a quantum computer. Violations are met with ridicule and contempt, the "everyone is stupid except for me" treatment.

Those who can do, and those who can't find some dope in the pop news to ridicule so their toady followers have something to chomp on. Hey, let's talk about talking about talking about maybe thinking about getting ready to one day maybe start to think about this. Feel good? Wow. I'm impressed.

OK, you can release the killer bees now.

Jack, you're obviously a pretty smart person -- only a smart person could have written such a sarcastic post -- so why not try writing a post in a more positive vein?

The prosperity of QIS/QIT as a discipline requires a shared, widespread appreciation that theorists, experimentalists, and engineers are not rivals, but partners. And also a sober appreciation that in the long run, fundamental QIS/QIT and applied QIS/QIT must advance together, else neither will advance at all.

Let us emulate Mr. Obama in embracing Lincoln in our blog posts: "The progress of our [QIT/QIS research toward practical applications], upon which all else chiefly depends, is as well known to the public as to myself, and it is, I trust, reasonably satisfactory and encouraging to all. With high hope for the future, no prediction in regard to it is ventured. ... Though passion may have strained it must not break our bonds of affection [between mathematicians, scientists and engineers]. .. The mystic chords of [the hallowed tradition of scientific unity], stretching to every living heart and hearthstone all over this broad [planet], will yet swell the chorus of the [the global community of mathematicians, scientists and engineers], when again touched, as surely they will be, by the better angels of our [human] nature."

Aye laddie ... now that's a program! :)

Who are you kidding? Your friend Scott Aaronson
publically rips D-wave for even thinking about thinking about building a quantum computer. Don't pretend to be nice.

Because my friend rips on somebody, that means I share his opinion? What universe do you live in, jackee-boy? And what's nice got to do with it? Nice is for anonymous blog posters to care about, apparently. But thanks for playing!

Some believe D-wave or any other company should get THEIR ok first before building a quantum computer. Violations are met with ridicule and contempt, the "everyone is stupid except for me" treatment.

No, what some believe is that CEOs of companies should not make statements about computational complexity that are ridiculous. Do you disagree with that or are you fine with such statements?

Or maybe you're worried about comments people have made about the validity of the approach D-wave is taking. Do you actually have a concrete defense of why (1) adiabatic quantum algorithms will give usable computational speedups and (2) why noise won't limit the viability of the adiabatic algorithm? I myself can give defenses and critiques of these points, not silly statements like "you academics are such pretentious shits." Care to enlighten us, or would you like to wallow a little more in your own pile of bile?

Those who can do, and those who can't find some dope in the pop news to ridicule so their toady followers have something to chomp on.

Yes, me and my minions. Ha, that makes me laugh. Or are you bagging on Scott's minions? See, my minions are different from Scott's minions. Mine are actually all robots, secretly programmed from my lab in the CSE department at the University of Washington. Scott's are actually aliens. And they are all, universally, worthless people who deserve contempt from a blogger known as "jack."

Hey, let's talk about talking about talking about maybe thinking about getting ready to one day maybe start to think about this. Feel good? Wow. I'm impressed.

You put too many "thinking"'s in there.

But you must be an investor in D-Wave, no? What other reason would you have to ridicule the idea that someone else might actually try to get into the quantum computing game? And who's the one who is now deciding what should or should not be done in quantum computing? That's a picture of yourself you're looking at, jackee boy. And I think you need to shave; you're beginning to look like someone from the State Hospital for the Insane Number 2.

Dear John,
what you wrote (#38) is too condensed for me to understand even on the intuitive level. In particular I do not understand the idea of replacing Hilbert spaces with something different. (And I do not understand what string theory has to do with it.) best--Gil

Gil, our QSE group has a long review article on this topic (namely, numerical recipes for efficient quantum simulations on nonlinear subspaces). This article will appear in NJP pretty soon (I will email you a copy of the proofs when they are ready).

Ilya Kuprov and collaborators have two recent JMR articles that develop basically this same idea, using the language of density matrices as contrasted with the language of unraveled quantum trajectories.

The underlying geometric idea can be grasped by looking at Fig. 1 of Kuprov's second article and asking "what is the underlying Kählerian state-space to which Kuprov's Krylov subspace is tangent?" The point being, this is a natural quantum state-space geometry on which to work.

As for why the letter "K" is so ubiquitous in this literature, your guess is as good as mine!
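(For the numerically inclined, a toy sketch of the Krylov-subspace idea behind Kuprov's approach; the Hamiltonian and dimensions below are made up, and a real implementation would add breakdown checks and error control:)

    # Toy Krylov-subspace propagation: approximate exp(-i*H*t)|psi> by
    # projecting onto span{psi, H psi, ..., H^(m-1) psi} via Arnoldi, then
    # exponentiating the small m-by-m projected matrix instead of the full H.
    import numpy as np
    from scipy.linalg import expm

    def krylov_propagate(H, psi, t, m=10):
        n = len(psi)
        Q = np.zeros((n, m), dtype=complex)  # orthonormal Krylov basis
        h = np.zeros((m, m), dtype=complex)  # H projected onto the subspace
        Q[:, 0] = psi / np.linalg.norm(psi)
        for j in range(m):
            w = H @ Q[:, j]
            for i in range(j + 1):           # Gram-Schmidt against earlier vectors
                h[i, j] = np.vdot(Q[:, i], w)
                w = w - h[i, j] * Q[:, i]
            if j < m - 1:
                h[j + 1, j] = np.linalg.norm(w)
                Q[:, j + 1] = w / h[j + 1, j]
        # propagate in the small subspace, then map back to the full space
        return Q @ (expm(-1j * t * h) @ (Q.conj().T @ psi))

    # made-up 64-dimensional Hermitian H for a quick accuracy check
    rng = np.random.default_rng(0)
    A = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
    H = (A + A.conj().T) / 2
    psi = np.zeros(64, dtype=complex); psi[0] = 1.0
    print(np.linalg.norm(krylov_propagate(H, psi, 0.1) - expm(-1j * 0.1 * H) @ psi))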

-----

@article{Kuprov:2007aa, Author = {Kuprov, I. and Wagner-Rundell, N. and Hore, P. J.}, Journal = {Journal of Magnetic Resonance}, Number = {2}, Pages = {241--250}, Title = {Polynomially scaling spin dynamics simulation algorithm based on adaptive state-space restriction}, Volume = {189}, Year = {2007}}

@article{Kuprov:2008aa, Author = {Kuprov, Ilya}, Journal = {Journal of Magnetic Resonance}, Number = {1}, Pages = {45--51}, Title = {Polynomially scaling spin dynamics II: Further state-space compression using Krylov subspace techniques and zero track elimination}, Volume = {195}, Year = {2008}}

Oh yeah, Gil, I forgot to say what string theory ("ST") has to do with efficient quantum simulation ("EQS").

Well, it is helpful to recognize that ST and EQS share common goals:

(1) Both ST and EQS seek to predict the results of real-world experiments.

(2) Both ST and EQS must take great care to avoid obviously unphysical predictions (causality violations, lack of energy conservation, etc.).

(3) Both ST and EQS seek mathematical descriptions that are well-posed and computationally tractable.

(4) And finally, in service of (1--3), both ST and EQS are looking beyond linear state-spaces.

Given this commonality, it is unsurprising that both ST and EQS end-up working on the same generalized quantum state-spaces, namely Kähler manifolds (especially algebraic Kähler manifolds).

A decade ago, not many people foresaw that engineering schools would be hiring PhDs in QIS/QIT ... does this foretell that in coming years, engineering schools will start hiring string theorists? Maybe! :)

Oh yeah #II ... maybe I'd better say too, why string theory (ST) is much harder than efficient quantum simulation (EQS) ... the reasons are obvious-but-interesting (IMHO).

Both ST and EQS researchers do quantum mechanics on Kähler state-spaces, but EQS researchers (mostly) do ordinary quantum mechanics, while ST researchers do field theory ... which is much harder.

Furthermore, EQS researchers know a priori that a dynamical trajectory on a curved quantum state-space is simply the integral curve of a J-rotated gradient of a Hamiltonian potential φ (which is about as simple as a dynamical system can be) ... and EQS researchers even know the functional form of this potential (it is just φ = ⟨ψ|H|ψ⟩) ... while ST researchers are (mostly) still trying to guess what their dynamical equations might be.

So the bottom line is, learning EQS provides a rather good starting point for learning the (much harder) basics of ST.

And I will also mention that a much-quoted theorem of Abrams and Lloyd, to the effect that nonlinear quantum mechanics allows NP-hard problems to be solved with polynomial resources, definitely does not apply in the EQS/ST paradigm, the two reasons being that (1) a Liouville-type theorem prevents the spread of quantum trajectories, and (2) the Abrams-Lloyd assumption of exponentially large state-space dimension is (in general) not satisfied.