
Talk:Quantum computing/Archive 1


Discussion tidy-up

Obviously quantum computing attracts a lot of attention, but with all due respect, this discussion page is a bit of a mess.

It would be much better if people used the separate sections of this Discussion page (the table of contents is about half-way down...) to comment on particular parts of the article. There should also be a separate section here for general quantum computing questions, as well as for a lot of the speculation (i.e. things not related to the article's content).

I don't mean to whine, I just think we could be making much better use of the Discussion page to keep up a good article. Also, I would happily do some rearranging and filtering myself, I'm just afraid to hurt anyone's feelings by moving their comments...

-Svet - Tomatoman 20:26, 6 January 2006 (UTC)

Limitation on Possible Computation?

I am ignorant about quantum computers, and I have a question. Are not all results determined statistically? If you want to distinguish among 10^100 possible answers, of which only one is correct, then do you not have to distinguish among 10^100 possibilities statistically without error? Does this not hint that maybe the uncertainty principle will raise its ugly, but effective, head? In fact, will this not happen long before we get to the point of doing calculations that we could not otherwise do?

---John Sellers Sept. 25, 2005 (see http://www.sellers.com/)

Follow up March 13, 2006

Although I am ignorant about Quantum computers, I know a thing or two. I think the article is excellent, but my question is a good one. Either the experts have a specific answer to this question, or they do not. In either case, some discussion about it would be informative and useful.
Most people do not have much of a sense of big numbers. When I spoke of 10^100, this is a number larger than the number of atoms in the Earth, which is roughly 10^50; and yet we can factor numbers larger than 10^100, and so have to discover even larger prime numbers for the purposes of encryption.
Even relatively “small” sets of permutations get large very fast. If you represented all the permutations of a Rubik's Cube with 3-inch cubes and then stacked them, the pile would be about 5.5 light years tall! However, any intelligent 16-year-old who is willing to spend thirty or forty hours on it can solve the Rubik's Cube without quantum computing (if our brains don't somehow use quantum effects).
On other fronts, entropy says that we can't build a perpetual motion machine because entropy always increases. If such a machine had a ratchet to make it turn in a particular direction due to the shaking of Brownian motion, the probability of the ratchet spontaneously jumping up in such a way as to make it go backward is just right to assure that, on average, our machine will not turn forward or backward by itself, no matter how we design it.
It is not unreasonable to ask, if we make a machine which will pick our answer, whether there is some mechanism preventing us from doing so. After all, we have to sort through more choices than all the atoms in the Earth and, without making a mistake, pick the exact right one, especially since we get our answer by "throwing the dice".
Please remember that entropy is purely a mechanism of statistics and owes its inevitable increase to the extreme unlikelihood of its ever decreasing. In addition, we know of limits on certainty derived directly from the probabilistic nature of measurement: for example, if we know the speed of a particle accurately, we cannot know its position accurately. Isn't this suggestive that, in our search for information from a quantum computer, measurement may place some fundamental limitations on what we can learn? Has anyone actually solved any problem with quantum computers that we can't solve otherwise? The fact is, quantum computers are completely unproven for actually doing non-trivial problems.
Again, the answers from quantum computing are also purely a matter of statistics, so again it is not unreasonable to ask whether the same constraints, namely those of entropy, might not apply and limit what we can possibly do with quantum computers.
I am not saying quantum computing won't work, but I have the sense that there is something similar between getting multiple probabilistic answers from a machine while expecting them to converge to an entropy-free answer, and the repeated probabilistic ratcheting of a perpetual motion machine while expecting it to go forward.

---John Sellers March 13, 2006 - wee hours


---John Sellers March 13, 2006 - afternoon - Yesterday, when I wrote my follow-up, I did not have an example of what kind of mechanism might stand in the way of our using quantum computers to solve otherwise unsolvable problems. After sleeping on it, I woke up with some interesting reasoning about how this might be.

Please understand my comments in the proper context. I hope that quantum computers do work out, because of the problems they would be able to solve for us. However, I worry about the unregulated and unquestioning enthusiasm I see for the idea. Only a sense of proportion tied closely to actual reality is going to take us in the right direction. When ideas are carried out without proper balance, they tend either to burn out when things don't work out as expected, or never to get off the ground because they do not gain the attention they are due. If balance is achieved, the overall final outcome is much more likely to be better for everyone on average, with less risk to each. I do not see this balance in considerations of quantum computing, else most comments would address the cons as well as the pros.

In this day and age, everyone is doing the 'hype'. You walk into any supermarket and most items on the shelves try to look bigger than they really are, and that is not good for the best result all around. What I more often see instead is what I have seen on several occasions here in Silicon Valley: grandiose and expensive efforts involving many people which turned out to be "putting roller skates on a dead horse", instead of the more thoughtful support of many small, competent groups covering many diverse efforts in many directions over long periods of time, taking on the roadblocks one at a time as they are discovered and knocking them down in a relatively inexpensive and effective manner. Fewer people would get extremely rich this way, but a greater number of people would make a reasonable and comfortable middle-class living. Any competent investor knows the importance of diversifying. I'm sometimes puzzled why there are so many razzle-dazzle schemes when they are so non-optimal and risky. Of course we need bigger companies as well, for efforts that small operations can't do, but even there the better approach would be diversified effort among smaller groups rather than the typical larger and riskier efforts, which can completely kill an undertaking if they fail. Large projects are too expensive to try multiple times, but small projects can afford to try until they get it right.

So on with my argument and questioning. The promise of quantum computers is tied to examining huge numbers of cases in order to find an answer. But as soon as you deal with large numbers, the conditions for success require compounding a huge number of possibilities without any artifact whatsoever popping up and systematically masking the correct solution. The question you have to ask is: how likely is such an artifact? Let's examine a few cases and see what is indicated.

For the first experiment, let us assume that we have monkeys at typewriters. For simplicity let's assume that the typewriters have 26 keys and a space bar, but no capitals or punctuation, and that each key is equally likely to be typed. How much of the Gettysburg Address can we expect to be typed once, on average, for every 10^100 tries? The answer is the number of characters it takes for there to be 10^100 possible strings, each equally likely and of which only one is correct; then on average we would expect that much of the address to be typed once every 10^100 tries. This works out to 100 x log(10)/log(27), or between 69 and 70 characters, and would look like this:

   “four score and seven years ago our fathers brought forth on this cont”...

What this tells us is that if there is any artifact whatsoever which has a better than one-in-10^100 chance of occurring, it is more likely to occur than the answer we want. Now just suppose that the artifact has only one chance in 10^88 of occurring. That is still very unlikely; in fact, in this case our monkeys would on average type out a fraction of a character more than the first 61 characters of the Gettysburg Address. However, such an artifact would occur, on average, a trillion times in the course of our quantum computer's investigation of our 10^100 cases. When we do our statistics, they may very well be contaminated by any happenstance event which can occur, just in the same way that nature hates a vacuum: nature will make sure that any free molecules floating around will certainly fill that space.
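
A minimal sketch of the arithmetic above (Python; the 10^88 figure for the hypothetical artifact is the one assumed in the text):

    import math

    # Characters k such that 27^k = 10^100: a random k-character string from a
    # 27-symbol alphabet matches a fixed target once per 10^100 tries on average.
    k_full = 100 * math.log(10) / math.log(27)
    print(k_full)            # ~69.9, i.e. between 69 and 70 characters, as stated

    # Same question for an artifact with probability 1 in 10^88 per try.
    k_artifact = 88 * math.log(10) / math.log(27)
    print(k_artifact)        # ~61.5, "a fraction of a character more than 61"

    # Over 10^100 tries, a 1-in-10^88 event occurs about 10^12 times (a trillion).
    print(10**100 // 10**88)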

If a single quantum computation and checking of the result took an hour, we might have to wait a trillion hours to verify our result. Presumably, results are at least somewhat resistant to helping us with the next trial, because if this were not so, we could use conventional means, without quantum computers, to incrementally improve on the problem, as in the computer memory industry, where the early attempts with the IBM PC had us installing an 8000-byte board and today we can easily install a 1-gigabyte board.

It gets worse than this. If there is likely to be one such artifact, there are likely to be a significant number of other artifacts. We can't investigate this situation easily, but we can look around and see how the world behaves in other situations of probability. I would like to present three examples along these lines: one divorced from quantum effects, one which is closer to quantum situations, and one in which quantum effects strongly play a role. Let us see how they behave.

In New Mexico there is Carlsbad Caverns. This unusual cave is a virtual fairyland of interesting cave structures. In one place you will find a room full of cave pearls, formed by the happenstance dripping of water which slowly rotates the solidifying material over the centuries to form hundreds of perfectly round rocks. In another room, with slightly different conditions, you will find wonderfully delicate and intricate lace-like structures that formed over the ages. In yet another room you will find large, almost smooth structures as big as trees reaching to the ceiling. In other places you find structures hanging from the ceiling that look like icicles. Here is the point: these structures are all relatively close to each other in the cave. You can go right from one room to the next to see all this wonderful variety, and yet it all results from seemingly random and arbitrary differences in very stable conditions. In other words, it seems the requirement for each of these rooms is that the conditions be stable and ordered enough, and that the temperature, the amount of water, the way it is applied, and the material dissolved in the water each be within a certain range. The surprise is that depending on the exact values within this range, one of a surprisingly large variety of characteristic results will occur as they accumulate over time. So if we were to set up a new room for a few hundred years with stable conditions within the critical ranges, but different from all the other rooms, we would have a good chance of some different, interesting result.

When we consider this in light of the many other things we know about our world, we can reasonably hypothesize that the more ordered a situation, the more likely some seemingly arbitrary and unknown systematic effect will occur, that effect being more likely than any other effect in its favored set of circumstances. It would be extremely difficult to predict the particular effect which would occur, such as the cave pearls, without actually performing the experiment over the extremely long period of time it takes the pearls to form. The point is that in quantum computing we are seeking to find the absolute order of a particular situation, by looking more and more closely at the perfectly ordered solution, and somehow having to avoid the possibility that some other, unwanted artifact of ordering is more likely to occur than the result we desire. Also, just as we have trouble with too much order, we will also have trouble with not enough; eddy currents in superconductors are an example of this.

This kind of ordering does not occur just at Carlsbad Caverns. A simple experiment with an old-fashioned oscilloscope will produce the same sort of amazing variety of artifacts. Take an oscilloscope and sum several pure sine waves of different frequencies, amplitudes, and phases on the x axis, and take another, different and independent set as input to the y axis. Next put a spike generator on the z input, which will blank out the trace on the screen except at the instant of the spike. You should make the frequency of this z input a little lower than the input frequencies, so that we see an arbitrarily distributed sampling of the inputs rather than a string of closely spaced points. This will result in a large number of points moving around the screen in a beautifully symmetrical and intricate pattern, with each point most often seeming to move independently of all the other points. These are the next order of complexity above Lissajous figures and have similar characteristics, in that if you incrementally change any component of any harmonic input in just about any manner, you will get a new and profoundly complicated dance of points on the screen that seems completely different from the one before. The possible variety of these patterns is, for all practical purposes, infinite. Each setting produces a maximally rich and different dynamic dance and ordering of the movement of the points. In fact, by coordinating the settings we could achieve any analytical pattern we desire. It is interesting and significant that just about every setting will produce a nearly unique dance, and yet one arbitrarily similar to a near-infinite number of other dances. Some are similar to others, but most often they differ in some large way, and they quickly jump from one pattern to a completely different one depending on how close our relative settings are to any set of harmonic ratios, which densely map every possible value on the real number line (any value on the real number line can be approximated to any degree of precision by a fractional ratio). This dance is in fact somewhat related to the dance of electrons in an atom, and its variety is governed by similar rules, except that we have explicit control over an arbitrary number of independent input frequencies, amplitudes, and phases.
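
A minimal sketch of the oscilloscope setup described above (Python with NumPy; the particular frequencies, amplitudes, phases, and strobe rate are arbitrary choices, not values from the text):

    import numpy as np

    # x and y deflections: independent sums of pure sine waves.
    def harmonic_sum(t, components):
        # components: list of (amplitude, frequency_hz, phase) tuples
        return sum(a * np.sin(2 * np.pi * f * t + p) for a, f, p in components)

    x_components = [(1.0, 3.0, 0.0), (0.5, 7.0, 1.2)]   # arbitrary example values
    y_components = [(1.0, 4.0, 0.7), (0.4, 9.0, 2.1)]

    # z-input strobe: sample the beam a little more slowly than the input
    # frequencies, so successive dots land far apart on the figure.
    strobe_hz = 2.7
    t = np.arange(0, 200, 1.0 / strobe_hz)

    points = np.column_stack([harmonic_sum(t, x_components),
                              harmonic_sum(t, y_components)])
    print(points[:5])   # the "dance of points"; nudging any frequency reshapes it

Plotting successive points (with matplotlib, say) shows the pattern jump to a qualitatively different figure whenever any frequency ratio is nudged, which is the behaviour the paragraph describes.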

This reinforces what we learned from the previous experiment: that ordering under arbitrary stable dynamic conditions is likely to produce systematic artifacts. I believe that we will discover the same is true for quantum computers as we carry out calculations. However, it may be that these can be estimated by quantum calculations developed for the purpose.

For my last example, I would like to talk about optical gyroscopes. It has been some years since I read about them, so my information may not be up to date; however, they have been the most accurate way to measure rotation. This is done by sending light in opposite directions through a fiber optic winding and then, when the two light paths return, putting them through an interferometer. The light is in phase or not, and as the gyroscope rotates, one path is longer and the other shorter because of the transit time for the light to go through the windings. The amount of phase shift is continuously measured, and the amount of rotation over time can then be calculated. The interesting part for us is that there are a significant number of artifacts that reduce the accuracy of this method. For example, the Zeeman effect of a magnetic field on the light causes it to spread over a range of frequencies and thus affects the accuracy of measuring the frequency shift. If I remember correctly, there are so many artifacts that the reduction in precision of the devices built at that time was about 14 orders of magnitude. It also turns out that these losses of precision are mostly, if not all, quantum in nature. The methods used to predict precision or to minimize the uncertainty in these devices are calculations associated with the uncertainty principles of quantum physics.

Again, what we learn is that real-world applications reaching down to the level of quantum effects are often limited by quantum effects and uncertainty.

There may be hope, however, given the optical gyroscope as an example. Even with the loss of 14 orders of magnitude of precision, these gyroscopes were still the most accurate method of measuring rotation known to us.

There is another "unexpected" outcome from this note. After writing it, I notice that maybe an "unexpected" result might not be such a bad thing. When we discover new kinds of ordering, this leads us to new kinds of discovery. Maybe we cannot use quantum computers for breaking security codes, but maybe we can use them, with their unpredictable and highly ordered results, as a tool for finding new and useful ideas. ---John Sellers March 13, 2006

The statement "There is a common misconception that quantum computers can solve NP-complete problems in polynomial time. That is not known to be true, and is generally suspected to be false." seems to imply that quantum computers are not Non-deterministic Turing Machines. If this is true, please state so.

-- Christoph Reichert, 9 August 2005

Quantum computers are not Non-deterministic Turing Machines.  :) NTMs can just assume they already have the answer and just verify it. A quantum computer has more work to do than that; its first guess may not be correct, among other differences. --David Battle 21:50, 9 August 2005 (UTC)

I have added yet more info on the size estimate of the state machine needed to simulate a N qubit register (with an example given for a 640 qubit register) at the bottom of the page. --David Battle 02:00, 8 August 2005 (UTC)


"For problems with all four properties, it will take an average of n/2 guesses to find the answer using a classical computer. The time for a quantum computer to solve this will be proportional to the square root of n".

Shouldn't that be log(n)? If not, a little explanation on this rather unexpected outcome would be appreciated...

Apart from that it's a great piece of work! --MK


This quote in the article is nonsense:

"Integer factorization is believed to be practically impossible with an ordinary computer for large numbers (eg. 10^600)."

While that's a really big number, a high school freshman can factor that big number in his head! 2^600 x 5^600. Gee, that was tough!

--Seinfeld's_Cat

Thanks for spotting this. I changed the article accordingly. Of course, you could also have changed the article yourself. -- Jitse Niesen 15:06, 30 Jan 2005 (UTC)
No problem. It sometimes seems easier to complain than to just fix it. I like your change (I couldn't come up with a suitable fix; you did it quite well.) I make a better critic than author. -- Seinfeld's_Cat 1 Feb 2005

The talk page is as interesting as the main article. Amazing! --ReallyNiceGuy


Great article!

Maybe we could add in the first paragraph that quantum computers are by their very nature probabilistic devices and only probabilistic algorithms can be implemented on them. Also, quantum computers can be simulated by Turing machines and therefore pose no challenge to the Church-Turing thesis. --AxelBoldt


I've now added both. They're in the complexity theory section, where they seemed to fit the flow best. -LC


Yes.

In the complexity section, it says that BQP is the class of problems that can be solved by quantum computers. These however are only the problems that can be solved by quantum computers in polynomial time. Maybe you could say "the problems that can be solved by quantum computers in reasonable time" or "that can be realistically solved by quantum computers".

Then it says that quantum computers probably cannot solve all NP-complete problems. There are two problems with this statement: strictly speaking, a quantum computer only works probabilistically and cannot "solve" any NP-complete problem (or any other decision problem for that matter) in the same sense a Turing machine solves them, with a deterministic and correct Yes/No answer. Furthermore, if we allow probabilistic solutions, then quantum computers can of course solve all NP-complete problems, just like any Turing machine can; it may just take a lot of time to do so... --AxelBoldt


It should be clearer now. -LC


What is this #P-Complete ? --Taw


See #P-Complete. -LC


How long does it take to factor an n-digit number with n qubits? --Axel


Would Quantum Computing break Elliptic Curve Cryptography ? Taw

Apparently so, through a variation on Shor's algorithm. I haven't studied it, though -- CYD

Some recent work [1] indicates that if spaced sufficiently closely, quantum entanglement between quantum dots may be possible, so it's possible that in the future a quantum computer could be implemented using quantum dots.

Also, it might be useful to mention "reversibility", the Hadamard transformation and the various types of quantum logic gates (CNOT, etc...). I'm not an expert, so I'll defer to someone who knows about this stuff -- Matt Stoker



What does it mean to have a "quantum computer with nonlinear operators"? --Robert Merkel

I hope that's clearer now. --LC

How the heck do you implement a nonlinear operation on qubits? Evolution always proceeds by unitary - therefore linear - operations. -- CYD

I agree, that seems to be a consequence of Schrodinger's equation, which is pretty basic to QM. --AxelBoldt

I agree. The papers proving the result never said it could be done. But it's an interesting result. It hasn't been ruled out that a large linear system could act like a small nonlinear system, and give the desired result. The Shi paper refers to the nonlinearity of the Gross-Pitaevskii equations, but I'm not familiar with them. I would assume the Shi paper is flawed, since it wasn't accepted anywhere, but I'm not aware of any proofs that this sort of thing is inherently impossible. --LC

Oh. I just checked the site, and the Shi paper has now been accepted in a journal. It hadn't yet been accepted when I first wrote the article. I'll remove the "not peer reviewed" note. Is anyone here familiar with the "International Journal of Modern Physics"? Is it generally respectable? --LC


The quantum computer in the above example could be thought of as a black box containing 8 complex numbers. Or, it could be thought of as 8 black boxes, each containing 1 complex number, each sitting in a different alternate universe, and all communicating with each other. These two interpretations correspond to the Copenhagen interpretation and many worlds interpretation, respectively, of quantum mechanics. The choice of interpretation has no effect on the math, or on the behavior of the quantum computer. Either way, it's an 8-element vector that is modified by matrix multiplication.

I don't think this characterization of many worlds is correct. The different universes don't come with complex numbers attached. Instead, the more likely states are exhibited in more universes. The goal of the matrix manipulations is to bring essentially all universes into the same state. Once you measure the state, the universes stop to communicate and truly split. --AxelBoldt

Good point. I've reworded it. --LC
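
For concreteness, here is a minimal sketch of the "black box containing 8 complex numbers" picture quoted above (Python with NumPy; the starting state and the choice of gate are illustrative assumptions, not taken from the article):

    import numpy as np

    # A 3-qubit register is described by 8 complex amplitudes, one per basis
    # state |000>, |001>, ..., |111>.
    state = np.zeros(8, dtype=complex)
    state[0] = 1.0                      # start in |000>

    # One step of a quantum computation is multiplication by a unitary matrix.
    # Here: a Hadamard gate on the first qubit, identity on the other two.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    U = np.kron(H, np.eye(4))           # 8x8 unitary acting on the whole register

    state = U @ state
    print(state)                        # |000> and |100> each get amplitude 1/sqrt(2)

Whether one pictures one box of 8 numbers or 8 communicating boxes, the arithmetic is exactly this vector-and-matrix multiplication.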


In the "How they work" section it says:

"The square of (a+ib) is (a^2+b^2)."

Actually the square of (a+ib) would be (a^2-b^2) + i(2ab).

Probably you mean the norm or the mod or something like that? (it's been a while since i used complex numbers, not sure of the name any more).

In quantum mechanics, (a+bi) is called the amplitude, and (a^2+b^2) is called the squared amplitude, even though the latter equals |a+bi|^2 rather than the (a+bi)^2 that the name might suggest. I suppose the terminology is counterintuitive. I'll reword the article to make it clearer. --LC
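
A quick sketch of the distinction in Python (the values are chosen arbitrarily):

    a, b = 0.6, 0.8
    z = complex(a, b)

    print(abs(z) ** 2)     # squared amplitude |a+bi|^2 = a^2 + b^2, ~1.0 here
    print(a * a + b * b)   # same thing
    print(z ** 2)          # the algebraic square (a^2 - b^2) + i(2ab) = (-0.28+0.96j)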


I've removed the following contribution from User:Harry Potter:

Quantum computing can also theoretically produce time anomalies. The ability of the Quantum computer to find the correct answer in a factorisation problem can be seen as running all the possibilities simultaneously. However, only the correct answer produces a positive response which is then sent back through time to become the only calculation which in fact needs to be made.

Quantum computation doesn't involve time travel in any description I've seen. -- Oliver P. 00:01 9 Jun 2003 (UTC)

Agreed. This sounds purely abstract and theoretical to me, but I am no expert on quantum computing by any means. Perhaps Harry Potter can produce a source in the quantum computing literature that lends some support to this idea? -- Wapcaplet 00:10 9 Jun 2003 (UTC)

Please specify "... have recently been built" into "... have been built in the early/mid-/late 20th century/21st century". --Menchi 08:37 14 Jul 2003 (UTC)


I found the first paragraph of the second section, titled "The Power of Quantum Computers", rather unclear. Unfortunately, I cannot tell exactly what makes it unclear - maybe it is too vague; how difficult is factorization (suspected to be) on a classical computer, and what is the speed-up on a quantum computer? I do not know enough about quantum computing to rewrite the paragraph myself ...

I liked the rest of the article very much, it contains (almost) everything that I was looking for. However, I am still wondering whether there is an upper bound on the power of quantum computers. The article says that quantum computers cannot solve undecidable problems, but is BQP known/suspected to be a subset of another complexity class, say NP? -- Jitse Niesen 11:08, 26 Aug 2003 (UTC)

They CAN solve Turing-unsolvable problems, but...

This is a technicality that should be mentioned: If you define a quantum computer as a Turing Machine with arbitrary complex amplitudes, then the class of machines that you obtain is uncountably infinite, and can easily be shown [ADH97] to contain machines that solve all kinds of Turing-unsolvable problems. But *we* can't build those QC's (or even know one if we see one) because of our inability to know the correct amplitudes required for those tasks. The computability comments in the present version of this article are valid for QC's whose amplitudes are computable numbers.

[ADH97] L. Adleman, J. DeMarrais, and M. Huang. Quantum computability, SIAM Journal on Computing 26:1524-1540, 1997.

Key exchange versus encryption

You say:

"This ability would allow a quantum computer to break many of the cryptographic systems in use today. In particular, most of the popular public key ciphers could be quickly broken, including forms of RSA, ElGamal and Diffie-Hellman."

My understanding is that RSA is an encryption method and that ElGamal and Diffie-Hellman are key exchange protocols. The references within Wikipedia also say that. Do you mean that the encrypted key being transmitted is able to be read? BK


Further editing

Some of the section which was there before (and is now called Initialization and termination), though written at a good level, I believe contains some inaccuracies, i.e. on how the qubit registers are measured. In fact, only a fixed register is sampled on termination. CSTAR 04:27, 6 Jun 2004 (UTC)

Oops that's not what I should have said. Anyway I think it's right now. I also split the article up adding the stuff on reversible registers to quantum circuits.CSTAR 07:01, 6 Jun 2004 (UTC)

Where did the numbers come from in columns #2 and #3 in the "bits vs qubits" section?

Why is the 3-qubit "000" represented as "0.37 + i 0.04"? Where did the 0.37 and 0.04 come from? Ditto for the rest of the numbers in that table. According to www.qubit.org, a 3-qubit system can represent 8 values. There is no magic in 16 analog values vs 8 complex values, there are just 8 values total. None of the references I have come across mention anything other than an L qubit system representing exactly 2^L possibilities at once, not 2^(L+1) (the 16 you mentioned in this section).

In an effort to give the reader some sense of how probability waves can be represented by complex numbers and how complex numbers can be manipulated, you've obfuscated the simpler facts of qubits: 1 qubit holds 2 states simultaneously with *equal probability*, 2 qubits hold 4 states simultaneously with *equal probability*, and so on.


I didn't write this example, but as an example, I think it's mostly OK (but you're right about the dimension being wrong: see below; when I read it I wasn't thinking). What the example is trying to express is the fact that in a quantum computer the set of registers is a quantum mechanical object, and the state is described by a wave function on it. As far as the number of independent values goes, the example is meant to suggest the real dimension of the space of states, but you may be right that the dimension counting is wrong, because the relevant space is complex projective space of dimension 2^n. Thanks. I'll fix it. The set of pure states is homeomorphic to a compact symmetric space with m = 2^n.
As far as your statement:
1 qubit holds 2 states simultaneously with *equal probability*, 2 qubits hold 4 states simultaneously with *equal probability* and so on.
I don't believe this statement is correct.

CSTAR 03:10, 16 Jun 2004 (UTC)

P.S. From this log, it wasn't clear who wrote the question and who wrote the response, so I went back and edited the past, separating my answer from the anonymous question. I'm perfectly happy to edit the past. More of us should try it. CSTAR 04:48, 16 Jun 2004 (UTC)

I think the table of numbers in the bits vs. qubits section obfuscates the concept of superposition. It might be better to start with the simpler 2-qubit case and say something like: "with two qubits a quantum computer can be in a state which is a mixture of all the classically allowed states. The particular state is determined by four complex numbers. In quantum mechanics notation we would write |ψ> = α|00> + β|01> + γ|10> + δ|11>, where α, β, γ, and δ are complex numbers." Bra-ket notation may also make things harder, but it is at least important to give the general version before the specific example with numbers. Bjohnson00 20:39, 29 July 2005 (UTC)
Perhaps you're right. As mentioned earlier this example has gone through various stages of editing. If you can change it without doing too much violence to the article, go ahead.--CSTAR.
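
A minimal sketch of the two-qubit description suggested above (Python with NumPy; the particular values of α, β, γ and δ are arbitrary, chosen only so that the state is normalized):

    import numpy as np

    # |psi> = alpha|00> + beta|01> + gamma|10> + delta|11>
    alpha, beta, gamma, delta = 0.5, 0.5j, -0.5, 0.5   # arbitrary normalized example
    psi = np.array([alpha, beta, gamma, delta], dtype=complex)

    assert np.isclose(np.vdot(psi, psi).real, 1.0)     # |alpha|^2 + ... + |delta|^2 = 1

    # Measuring both qubits yields 00, 01, 10 or 11 with these probabilities:
    probs = np.abs(psi) ** 2
    for label, p in zip(["00", "01", "10", "11"], probs):
        print(label, p)                                # each 0.25 in this example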

other confusing bits

(problem I noted has been fixed -- thanks, everyone. -- DavidCary 01:43, 4 Jul 2004 (UTC))

Yes, it's rubbish, I rewrote the section. Unfortunately the bit about fault tolerant scaling in Shor's algorithm is a bit hazy, I'm just quoting vague unpublished numbers floating around my research group. -- Tim Starling 07:35, Jun 18, 2004 (UTC)
Good rewrite! Regarding range for T2, I checked Nielsen & Chuang book (pp. 337) and they claim 7 seconds and 0.3 seconds for proton and carbon in a two-qubit experiment. Should we change "hundreds of microseconds" to "seconds"? Or is there some caveat about Nielsen & Chuang data that I'm not aware of? Andris 18:37, Jun 18, 2004 (UTC)
Alright. It would be nice to explain at some stage why systems such as chloroform nuclear spins are not scalable, but I guess "nanoseconds to seconds" will do for now. -- Tim Starling 01:47, Jun 22, 2004 (UTC)

The article currently mentions the transverse relaxation time T2 (terminology used in NMR and MRI technology)

Is there really any difference between NMR quantum computers and MRI quantum computers ? I suspect they're really the same thing. -- DavidCary 01:43, 4 Jul 2004 (UTC)

I guess my comments weren't clear -- I was referring to MRI and NMR technology, not computers (MRI technology, the kind used in hospitals), which are essentially the same technology (I think at least they are, since they both involve nuclear spin) but seem to have separate wiki articles. Is my making these two references too confusing? Please change it if you can make it clearer. CSTAR 01:51, 4 Jul 2004 (UTC)

Interpretation

Now that we're at it, I think we should remove references to interpretation (Copenhagen, many worlds etc.). Regardless of content, they shouldn't be in this article. I haven't made an effort to understand whether what's written in the article makes any sense, so I won't pass judgment on that account. CSTAR 13:19, 18 Jun 2004 (UTC)

Edits of 08/01/2004

Some of the edits by an anonymous user did increase readability, but I think calling entanglement or superposition a process is misleading. Also, qubits don't hold data -- they are measures of the size of data. The register space is the actual container of data. I tried to keep the spirit of the recent edits while modifying what seemed misleading. CSTAR 02:49, 2 Aug 2004 (UTC)

---

So, how the hell are you supposed to make software for these? I assume a normal "if something is something do this, else do this" doesn't work?

Conditionals in the usual sense, no.CSTAR 20:08, 16 Aug 2004 (UTC)

Decoherence and Complexity

This article is really great !

Nevertheless, there is one point I found unclear : the relationship between decoherence and complexity.

The article states that some algorithms are in BQP. Since complexity classes are only meaningful for arbitrarily large calculations, does the definition of the BQP class include the fact that decoherence may limit the number of operations that may be performed, or does it assume we have a way to prevent the effects of decoherence for arbitrarily long calculations (which doesn't seem obvious)?

For example, an old (1995) article by P. Shor and colleagues, "Quantum Computers, Factoring and Decoherence", calculated a limit to the size of the integers that may be factored faster by a quantum computer than by a classical computer, which to me would contradict the fact that factoring is in BQP.

Some clarification of this point would be great !

Thanks !

That's a good question, but I'm inclined to say that there is no relationship other than that which is built into the model itself. To explain BQP rigorously, you have to formulate a more precise model of quantum computation than is done in this article. The right model is the quantum circuit, although BQP is not defined there either, though it should be. At some point I'll get around to it. Otherwise look at some of the references such as Kitaev, or Nielsen and Chuang. The book by Hirvensalo is also good. I know I have evaded your question, but that's the best I can do. Maybe some other wikipedian has a better answer. CSTAR 13:57, 1 Oct 2004 (UTC)
The definition of BQP assumes that computation is error-free (and decoherence-free), which is not the case in practice. The limit in Shor's 1995 article is for quantum computers which are subject to decoherence. Thus, there is no contradiction: the 1995 paper essentially says that if there was not decoherence, quantum computers could factor large numbers, but in presence of decoherence, they could not. This is the (pessimistic) view of quantum computation that was common in mid-1990s. This view is mostly obsolete now. Quantum error correction has been invented (in a later work of Shor and colleagues in 1996 and 1997) and it has been shown that arbitrarily large computations can be made fault-tolerant (with "fault" including "decoherence"), as long as fault rate is sufficiently small. Andris 00:10, Dec 11, 2004 (UTC)

Cost of quantum computations

The big selling point of quantum computation seems to be that quantum computers with enough qubits can run algorithms much faster than classical computers. That's great, but it's not very useful if, for example, the cost of implementing an n-qubit register is exponential in n. Is there some estimate for the asymptotics of this? Is the apparent advantage of quantum computers due only to the fact that we're measuring computational complexity by the wrong yardstick? (I have no idea, but someone surely knows...) --Andrew 08:48, Dec 7, 2004 (UTC)

NP-Complete stuff

"BQP is suspected to be disjoint from NP-complete and a strict superset of P, but that is not known. Both integer factorization and discrete log are in BQP. Both of these problems are NP problems suspected to be outside P. Both are suspected to not be NP-complete. There is a common misconception that quantum computers can solve NP-complete problems in polynomial time. That is not known to be true, and is generally suspected to be false."

A teacher claimed that someone has found a quantum algorithm to solve NP-complete problems in polynomial time. When asked for a size estimate, he said that the algorithm description should approximately fit on a page. He's supposed to be teaching everyone the definition of NP-complete problems next week, so he should know what he's talking about. He seemed rather sure. I'd like to see that algorithm... (Haven't had time to search the internet yet.) Would sure make quantum computers (even) more interesting... Κσυπ Cyp   17:27, 9 Dec 2004 (UTC)

Regarding the previous quote: Are integer factorization and discrete log (suspected to be) outside BPP? I think this is more interesting than whether they are outside P. -- Jitse Niesen 19:05, 9 Dec 2004 (UTC)
Yes, they are thought to be outside BPP. Andris 04:32, Dec 10, 2004 (UTC)
Thanks, I updated the article. -- Jitse Niesen 13:42, 10 Dec 2004 (UTC)
Does this article mean anything? I'm a bit sceptical, since it seems to be saying that because some things can be approximated by a non-linear function, the linearity of quantum mechanics can be ignored... Κσυπ Cyp   17:15, 16 Dec 2004 (UTC)

Clustered quantum computer ?

Hello,

I don't understand a lot about the technical side of quantum computers. I would like to know if it is possible (at least in theory) to do distributed quantum computing. Is it possible to dream about a “quantum grid”, a “quantum Beowulf”, a “quantum@home”?

Thank you

-- David Latapie 07:31, 25 Jan 2005 (UTC)

Can quantum computers solve undecidable problems?

This article says that quantum computers cannot solve any problems that classical computers cannot solve either. However, after a recent edit, the start of Matiyasevich's theorem reads:

"Matiyasevich's theorem, proven in 1970 by Yuri Matiyasevich, implies that Hilbert's tenth problem is unsolvable (although it has recently been proposed by Tien Kieu [Int.J.Theor.Phys. 42 (2003) 1461-1478] that this result only applies to a deterministic Turing machine, and that the problem could be solved on a quantum computer using a quantum adiabatic algorithm. This is still contraversial.)."

How do we reconcile these two statements? Does anybody know how controversial Tien Kieu's work is? -- Jitse Niesen 13:36, 10 Feb 2005 (UTC)

Probabilistic Turing machines or non-deterministic Turing machines don't decide more problems than plain old Turing machines. Neither does the quantum circuit model (and equivalents). Unfortunately, I don't know enough about what a quantum adiabatic algorithm is, but it possibly doesn't fall under the quantum circuit model of quantum computation. I suppose I should look at the paper. CSTAR 14:41, 10 Feb 2005 (UTC)


I just looked at one of his papers. It is highly suspect, making claims about probabilistic computation which under any normal interpretation are false. "Probabilistic computation may be more powerful than Turing Computation provided a suitable probability measure can be defined." Now it is well known (see Sipser's textbook on computation) that probabilistic Turing machines compute the same functions as normal ones.
I would take these claims with lots of salt (and maybe that sentence in Matiyasevich's theorem should be removed).CSTAR 18:15, 10 Feb 2005 (UTC)

Thank you; I came to the same conclusion. -- Jitse Niesen 21:47, 10 Feb 2005 (UTC)

Factorization

"For problems with all four properties, it will take an average of n/2 guesses to find the answer using a classical computer. The time for a quantum computer to solve this will be proportional to the square root of n...But it is also easy to defend against. Just double the size of the key for the cipher."

This doesn't sound right to me. If your key length is 2n, a classical computer would take n guesses. A quantum computer would then take time proportional to sqrt(2n), which is asymptotically the same as sqrt(n).

I didn't edit the page because I'm not an expert on quantum computing, and feel it's possible that I don't understand (if, for example, you can't do asymptotic analysis on quantum computing algorithms?). As best as I can figure, though, I would expect you'd need to square the key length?

Any problem with these four properties will take an average of (n+1)/2 guesses to find the answer: fix the guesses you are going to try. Each guess then has probability 1/n of being correct. The expected number of guesses is then (1/n)(1 + 2 + ... + n) = (n+1)/2.

n here is the size of the search space, which for a brute force attack on a symmetric cipher is equal to 2^L where L is key length. So the classical algorithm takes O(2^L) and the quantum algorithm takes O(2^(L/2)). Note that this applies to Grover's algorithm, which is not the usual algorithm used for factorisation. It's a slower algorithm for a more general class of problems. -- Tim Starling 19:22, Jun 12, 2005 (UTC)
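
A minimal sketch of the scaling being discussed (Python; the key lengths below are illustrative, and the counts are idealized query counts with constant factors ignored):

    # Brute-force search over a key space of size n = 2^L:
    #   classical search:   ~n/2 guesses on average, i.e. O(2^L)
    #   Grover's algorithm: ~sqrt(n) queries, i.e. O(2^(L/2))
    def classical_guesses(L):
        return 2**L // 2

    def grover_queries(L):
        return 2**(L // 2)

    for L in (128, 256):
        print(L, classical_guesses(L), grover_queries(L))

    # Grover cuts a 128-bit key down to ~2^64 work; doubling the key to 256 bits
    # pushes the quantum cost back up to ~2^128, hence "just double the size of the key".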

Spin Parity Component Quantum Computers: the "transistor" is here

Just recently a new device was described for quantum computers. It has a quantum dot with two electrons within, and an adjacent empty dot. The interaction is only allowed when the electrons have opposing spins, and this enables quantum jumping of the electrons in a stable and readable way. Then measurements will not cause decoherence, and we have in effect a quantum component that brings us out of the vacuum-tube era of quantum computing.

Daniel Hazelton Waters

Update: As March was beginning I looked at the quantum computing timeline, and there seemed to be more innovations in the first two months of this year than there were in the last two!!!


Unfortunately I believe the term quantum superstate only relates to the state at the time of interpretation. The actual state is derived from the superspin abilities of the given molecule. The principal state of the molecule is (a) stopped, hence 1 bit or the factor 0. The second state is spinning on its (b) axis, the third state is spinning from (c) pole to pole, and the fourth is spinning on (b) + (c) = (e); therefore your qubit problem is solved.

Fivetide


Can quantum computers ever be practical for factoring?

Can quantum computers ever be practical for factoring? I seriously doubt it. Consider the following:

As you may have heard, the "dimension" of an n qubit register is 2^(n+1) - 2, meaning that it has the same information content as 2^(n+1)-2 conventional bits.

To factor integers that are large enough to be hard on conventional computers right now you would need about 640 "qubits". For example there is an uncollected $20,000 reward for factoring RSA-640 which is a number with 640 bits. On the other hand, RSA-576 has been factored and the $10,000 reward collected.

There is something called the "holographic principle" in quantum physics which states that the maximum information content of a spherical volume of space is achieved only by a black hole of that size. The information content of a black hole in bits is believed to be the surface area of the sphere measured in Planck length (l_p) units divided by 4. Yes, that is area, not volume. That's why it's called the "holographic" principle. The Planck length is tiny, about 1.6x10^-35 meters.

So I decided to calculate the maximum information content of a "reasonable" sized quantum computer. I generously estimate that such a computer might have the same volume as a sphere with a radius of 100 meters. That's a pretty big computer. Could such a computer hold enough quantum information for 640 qubits?

The area of a sphere with radius 100 meters is 4*pi*(100 meters)^2. In Planck units this is 4*pi*(6.25e36 l_p)^2, or about 4.9e74 l_p^2. So the maximum information content of a sphere of radius 100 meters is about 4.9e74/4, or 1.2e74 bits. Another way to say this is about 2^246 bits. But this is nowhere near the information content of a 640 qubit register, which is 2^641 - 2 bits. And each additional qubit doubles the information content, so the required information is exponential in the number of qubits, while the maximum information content of space only increases as the square of the radius.
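
A minimal sketch of the arithmetic in the argument above (Python; the 100 m radius and the value of l_p are the ones assumed in the post, and the "2^641 - 2 bits" premise is the post's own claim, which is disputed below):

    import math

    l_p = 1.6e-35      # Planck length in meters, as approximated in the post
    r = 100.0          # assumed radius of the "reasonable" computer, in meters

    area_planck_units = 4 * math.pi * (r / l_p) ** 2   # ~4.9e74
    max_bits = area_planck_units / 4                   # holographic bound, ~1.2e74
    print(max_bits, math.log2(max_bits))               # ~1.2e74 bits, i.e. ~2^246

    # The post's claimed requirement for a 640-qubit register is ~2^641 bits,
    # which vastly exceeds the ~2^246-bit bound computed above.
    print(641 > math.log2(max_bits))                   # True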

Conclusion: using high school math, it is easy to show that Shor's algorithm will never be used to factor integers that can't already be factored on conventional computers, at least not on this planet. The mass of a black hole with radius 100 meters is vastly larger than the mass of the Earth (though not as large as the mass of the Sun). Any smaller amount of mass would have lower information density.

--David Battle 14:22, 5 August 2005 (UTC)

Re: David Battle's assertion:
"As you may have heard, the "dimension" of an n qubit register is 2^(n+1) - 2, meaning that it has the same information content as 2^(n+1)-2 conventional bits."
This is false. That is, dimension and information content are two very different things. For example, the interval of real numbers x such that 0 ≤ x ≤ 1 has dimension 1, yet contains more information than a single "conventional" bit. --CSTAR 16:28, 5 August 2005 (UTC)
Ok. Can someone give a formula for the information content of an n qubit register then? Also, I don't see how you could say that something had a certain "dimension" unless there were at least two possible values on each "axis". In that case, my estimate for the information content of a quantum register is a lower bound, and doesn't change the validity of my argument.
--David Battle 16:41, 5 August 2005 (UTC)
Sir, you have just proven the universe does not exist! (for example, a spin 1/2 lattice system with 640 sites, indeed a very modest quantum mechanical system cannot exist)
Good work! Now we can all retire and wallow in our non-existence!--CSTAR 17:17, 5 August 2005 (UTC)
Perhaps there is not as much information stored in such a system as is currently believed. Also, I'm not exactly sure what a "spin 1/2 lattice system with 640 sites" is, but are you honestly suggesting such a system contains 2^640 bits of information? 2^(2^640) microstates??
--David Battle 17:21, 5 August 2005 (UTC)
These are fairly simple models for ferromagnets etc. Moreover, I never said (nor does the article say) anything about bits of information; I said dimension. That's why I made the clarification above. See topological dimension and Bloch sphere.
Your argument seems to be based on the assumption that a space of topological dimension n can be used to physically store (from an input device) n bits of information and physically retrieve n bits of information. For large n this is quite likely not true (for the reasons you suggest), but your argument has absolutely no bearing on the physical realizability of Shor's algorithm for factoring numbers of the size you consider.--CSTAR 18:18, 5 August 2005 (UTC)
Ok, now we're getting somewhere. Current SIMULATIONS of quantum computers use 2^(n+1) bits of storage to simulate an n qubit register. This makes it intractable to simulate more than 30-40 qubits. Based on your comments it would appear to me that there should be a way to simulate quantum computers using much less memory. Could you help me calculate how many microstates are possible in an n qubit register? Thanks,
--David Battle 18:25, 5 August 2005 (UTC)
"Based on your comments it would appear to me that there should be a way to simulate quantum computers using much less memory."
Oh really? Where did I suggest that? I was talking specifically about limits of information flow from outside to inside and conversely, not about how to simulate what happens inside.--CSTAR 18:34, 5 August 2005 (UTC)
So how many microstates would you estimate are possible "inside" an n qubit register? I do not have the math skills to make this estimate, but I would love to know. Really.
Thanks,
--David Battle 18:52, 5 August 2005 (UTC)
I don't know offhand, but surely you can calculate this by some simple lattice model (640 sites) with a reasonable Hamiltonian.--CSTAR 19:07, 5 August 2005 (UTC)
Well, sadly, I can't. I'm a CS major. I've never studied lattices, and the only kind of Hamiltonian I'm familiar with is Hamiltonian paths in graphs, which I suspect is only tangentially related to what you are talking about. And I never really quite grokked complex analysis. I would say most CS majors don't understand that stuff either. Computer scientists think in terms of state machines. If quantum computing is going to be understood by CS majors, it is going to need to be described in terms of microstates and state transitions. Any system where you put some bits in, perform some "operations" (which is equivalent to putting some more bits in) and then take some bits out can be modeled by a state machine (maybe a non-deterministic state machine if needed). I am just trying to visualize the shape and size of the state machine described by an n qubit register, to put it in terms that I know how to think about.
Thanks,
-David
CSTAR, please see below. --David Battle 16:45, 6 August 2005 (UTC)

Okay, let me try to explain (I studied maths and CS, not physics). A digital computer has a finite number of states (2^n for n bits), but a quantum computer has an infinite number of states. For instance, the state of a 1-qubit computer is described by one complex number, which is two real numbers (the real and imaginary parts of the complex number). Since a real number has an infinite sequence of digits after the decimal point, it can never be stored exactly on a digital computer, but you can store an approximation to it as a floating point number, like a float in C. If you simulate an n-qubit quantum computer, you need 2^(n+1) − 1 real numbers to describe the state, so you can approximate these by 2^(n+1) − 1 floating point variables and hope that this works. This is explained in the section "Bits vs qubits" in the article.

Now, there is an important difference between classical and quantum computers: you cannot determine the state of a quantum computer. You can only do measurements, which give you some information about the state (and change the state in the process). This is why it is not that easy to determine how much information you can get out of an n-qubit computer. There is probably a formula for it, but I don't know enough physics to make a guess.

In short, classical computers and quantum computers are very different beasts, and most of the standard models of theoretical CS like Turing machines or state machines do not apply. Perhaps it's better to think of a quantum computer as an analog computer. -- Jitse Niesen (talk) 21:35, 5 August 2005 (UTC)
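
As a rough illustration of why the straightforward state-vector simulation mentioned above becomes intractable beyond 30-40 qubits (a sketch; the assumption that each amplitude is stored as one double-precision complex number is mine, not from the thread):

    # Memory to store the full state vector of an n-qubit register:
    # 2^n complex amplitudes at 16 bytes each.
    def state_vector_bytes(n):
        return (2 ** n) * 16

    for n in (10, 20, 30, 40):
        print(n, "qubits:", state_vector_bytes(n) / 2**30, "GiB")
    # 30 qubits is already ~16 GiB and 40 qubits is ~16 TiB; 640 qubits is
    # hopeless in this representation, which is the point being debated above.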


I understand that "hypothetically" a qubit contains an "infinite" amount of information. But saying "there's an infinite amount of information but we just can't get to it" sounds like a cop-out to me. Physicists like to talk about microstates (see the article on Entropy), and it seems to me that this concept might form a conceptual bridge between "quantum computer" concepts and state machines. --David Battle 21:58, 5 August 2005 (UTC)

I realize that you understand more of it than I thought. I don't know enough statistical mechanics to help you further, sorry. -- Jitse Niesen (talk) 22:23, 5 August 2005 (UTC)
Ok, I found a paper by some guys at Yale who claim to show that the information content of an N qubit register grows as N^2 [2]. If that is the case, I must withdraw my assertion that a 640 qubit register is impossible, because there would only be about 409600 bits of distinguishable information in such a register. However, if that is the case, then such a register should be able to be simulated by maintaining only about 409600 bits of dynamic state information, although the (fixed) state transition table could be *much* larger. I am still trying to work out just how large it would be. Stay tuned.--David Battle 16:45, 6 August 2005 (UTC)
Ok, some more tidbits. I found an article by Dr. Michael Frank of the University of Florida at Gainesville which asserts that it takes one Planck unit of "action" to toggle one bit. The units of action are energy times time. So, figure that for the typical "unitary operator" one might flip about half of the observable bits. Further, for quantum algorithms to be efficient they need to be limited to a polynomial number of operations (for example, this was one of the constraints Shor's algorithm had to meet). Now, unlike classical algorithms, quantum algorithms can't "branch" based on the current state, but rather must simply apply a series of operators to the system and then make a measurement. So if the observable state in an N qubit register is N^2 bits (see discussion above), then one might picture each possible unitary operator as a "bit mask" of length N^2, being 1 in the positions to be flipped and zero otherwise. So, for an O(N^3) algorithm (where N is the number of input bits) such as Shor's factoring algorithm, the total information content of the transforms to be made should be O(N^5). Now this says nothing about how to simulate such an algorithm with O(N^5) classical bits in O(N^3) time, but it does give one a feeling for the "size and shape" of the hypothetical state machine involved, which was what I set out to do. The total size of the machine for factoring a 640 bit integer would probably be somewhere north of 1 * 10^14 bits, or about 2^46 bits, or about 8 terabytes if my math is correct. So no danger of it collapsing under its own weight into a black hole. Comments? Corrections? Raspberries? --David Battle 02:00, 8 August 2005 (UTC)
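
A quick check of the figures in the estimate above (Python; it just reproduces the post's own arithmetic for N = 640 under its assumption that the observable state is N^2 bits):

    N = 640                     # size of the integer to factor, in bits

    observable_bits = N ** 2    # the post's assumed observable state
    transform_bits = N ** 5     # O(N^3) operations, each a mask of N^2 bits

    print(observable_bits)      # 409600, matching the figure quoted above
    print(transform_bits)       # 107374182400000, i.e. "north of 1 * 10^14 bits"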


If I were a high school student and I had the question "What is a quantum computer?", I don't think the introduction of this article would help me one bit, and the rest of the article would go straight over my head. Of course, a quantum computer is a very difficult concept, but surely the introduction could be made simpler. For example, in lay speak:

A quantum computer works by simultaneously working out the answer of a question with multiple inputs at once. For example, a question might be "Is this the correct password?". The inputs are then every possible password, for example, "aaaa", "aaab", "aaac", ... , "zzzz". A classical computer would have to ask this question once for each possible password. However, in theory, a quantum computer would be able to ask the question for every password at once, and return the correct password as the answer.

Of course, the above description is in practice not true, for reasons that I don't understand, to do with decoherence etc., but at least it gives a reader that has never heard of a quantum computer nor studied any quantum dynamics an idea of what all the fuss is about. I'm not going to edit the article myself as I don't trust my own understanding of quantum computers, but it would be really nice to see an easy-to-read and understandable description of what a quantum computer is in the introduction.

Your description is wrong. A quantum computer is only exponentially faster than a classical computer for a very small set of problems. It can't, even in theory, check all passwords at once. In fact, a quantum computer probably wouldn't be faster than a classical computer for everyday tasks like playing Unreal Tournament, since the "clock speed" in most quantum computer proposals is far slower than current classical computers. Anyway, about the introduction... I can't think of a better one, I'm always at a loss to explain these things to people who don't know quantum physics. -- Tim Starling 12:14, August 16, 2005 (UTC)
========

I'll not comment on whether the "password" description is right or wrong, but I do agree with the motivation: plain language is needed to attract people to this field. My definition of "people" is broader than high school students; it includes engineers and even some science majors, etc.

Can someone use a very short statement to tell people what quantum computing is all about? Please do everybody a favor by not talking about things such as "decoherence" etc. These terms only scare people away (some of these people could have made a lot of contributions otherwise). I believe being able to use plain language requires real understanding of the subject, and I am not sure how many people are qualified to do the job. At least it is worth a try. It will not hurt if you do not claim yourself to be correct.

Spontaneous decoherence as a fundamental limitation

Someone well-versed in QM should incorporate this finding in the article:
Quantum computers that store information in so-called quantum bits (or qubits) will be confronted with a fundamental limitation. This is the claim made by Dutch theoretical physicists from the Foundation for Fundamental Research on Matter (FOM) and Leiden University in an article recently published in the journal Physical Review Letters.

A quantum computer can only function if the information exists for long enough to be processed. The so-called coherence of the qubit ensures that the quantum information remains intact. The researchers have now discovered that the coherence spontaneously disappears over the course of time and with this the stored information as well. This could pose a considerable problem for the development of a quantum computer.

Gyan 21:53:40, 2005-08-25 (UTC)

The paper only states that measurement causes decoherence, which seems obvious to me, after all that's the whole idea of measurement. The game is to keep your system isolated from the measurement device while it is evolving, then to only measure it once it has done its work. Here's an enlightening quote about their model:
At time t = t0 we instantaneously include qubit a(b) in the Lieb-Mattis (infinite range) interactions of the spins on the B sublattice of the Lieb-Mattis machine. We then follow the exact time evolution of the coupled N + 2 particle system at t>t0
They discuss this interesting measurement model and put an upper bound on decoherence time, but I fail to see any threat to the viability of quantum computing. After all, practical quantum computer designs are informed by experimental decoherence times, which are necessarily shorter than any theoretical limit. -- Tim Starling 07:29, August 28, 2005 (UTC)
Again, a disclaimer that I'm not well-versed, so take it easy, but the PR goes on to state, "Much to their surprise they discovered that the coherence tends to spontaneously disappear, even without external influences.". Isn't a 'measurement' an external influence? - Gyan 23:50:04, 2005-08-28 (UTC)
Well yes, you'd think so. The press release does beat it up a bit, but I guess that's what press releases are for. It decoheres without influences external to the measurement device, but the measurement device is indeed external to the qubit. -- Tim Starling 00:25, August 29, 2005 (UTC)
Maybe I should add another word of explanation. A decoherence time of one second doesn't mean the quantum computer has one second to do all its operations and then that's it. It means the operations have to be fast enough such that quantum error correction can be implemented with a reasonable probability of success. One second is a decoherence time which solid state physicists can only dream of, usually you'd expect it to be on the order of microseconds. Gate operations have to be much faster than this for long-running calculations to be possible. -- Tim Starling 00:38, August 29, 2005 (UTC)
Can you explain what the researchers were much surprised by? - Gyan 06:07:59, 2005-09-01 (UTC)
Let's take this to private email. Tell me if you didn't receive my message, user-to-user email is unreliable. -- Tim Starling 01:21, September 2, 2005 (UTC)
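To put rough numbers on the decoherence-time point made a few comments up, here is a minimal sketch; the microsecond coherence time and the gate durations are illustrative round numbers, not values taken from the paper under discussion.

    # Illustrative only: how many gate operations fit inside one coherence time,
    # for a few assumed gate durations.
    coherence_time = 1e-6                  # seconds (microsecond-scale, solid state)
    for gate_time in (1e-9, 1e-8, 1e-7):   # assumed gate durations in seconds
        ops = coherence_time / gate_time
        print(f"gate time {gate_time:.0e} s -> ~{ops:.0f} operations before decoherence")
    # The point: gates must be several orders of magnitude faster than the
    # decoherence time for error correction to have any room to work.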


quantum algorithms

Often, people say that there are only two useful quantum algorithms: one is Grover's, the other is Shor's. I think that is false, although it is true that the main ideas of quantum algorithms can be represented by these two.

But Shor's algorithm has now been generalized to the Abelian hidden subgroup problem (or period-finding problem),

and there are some extensions to non-Abelian hidden subgroup problems.

As an example of the latter, we can name Hallgren's algorithm for Pell's equation, which is useful(?) for cracking some PKCs.

Also, it is important to note that the key part of Shor's algorithm is so-called 'phase estimation', i.e. estimation of an eigenvalue of a given unitary matrix. This part can be applied to counting problems, realizing a square-root speed-up.

Also, Grover's algorithm is applicable to amplifying the success probability of many classical and quantum algorithms, as Brassard, Høyer, etc. have pointed out. (This point was already made in the article, but my point is that the range of application is much wider.)

In the end, I'd like to point out that quantum walks now supply a new toolbox for realizing square-root speed-ups of many classical algorithms based on Markov chains.

Cosmic Implication of Quantum Computers

How does the wave function of the universe differ from a quantum computer? Weren't ALL the particles in the observable universe originally entangled? When did the first "decoherence" take place, and what triggered it?

If I understand the theory at all, we are trying to build a quantum computer that can be "programmed" to detect things like prime factors of very large numbers. How do we know we aren't living in a cosmic quantum computer that is "programmed" to detect "observers"?


Table is wrong

The table in the section Bits vs qubits has glaring arithmetical errors: The absolute value of the complex numbers in the second column cannot possibly be equal to the numbers in the third column. To fix this, pick eight angles θ_0, …, θ_7 and replace the kth row of the second column by e^{iθ_k} times the square root of the corresponding entry in the third column.

--CSTAR 20:04, 11 November 2005 (UTC)

Oops MY BAD, take that back... table is correct --CSTAR 20:09, 11 November 2005 (UTC)

Computational models category

Should this be put into the "computational models" category? - JustinWick 23:31, 16 November 2005 (UTC)


I work in biomolecular NMR and came upon this accidentally. I was interested to read that there is some possibility of using solution-state NMR as a quantum computer; is this really viable, and is anybody working on it? The solid-state (Kane) computer does seem very promising. How long does a solution system need to remain coherent for even a single quantum calculation to take place? Is coherence achieved via RF pulses or by some other means? In NMR the system remains coherent for less than 10^-3 seconds; is this long enough for a calculation?


Quantum byte!

Here: http://www.i4u.com/article4684.html

Other uses of quantum computers

I feel that the section "The power of Quantum Computers" overlooks some of the important possible applications of a quantum computer. This section seems to focus mainly the applications of Shor's algorithm to cryptology, and Grover's search algorithm. A short mention on a quantum computers use with regard to quantum simulations is made.

I would add two more important (theorised?) applications: firstly, the possibility of quantum computers generating not pseudo- but truly random numbers; and secondly, the phenomenon of entanglement and quantum teleportation. -Ray 28th Jan 2006

Conventional computers are already able to generate truly random numbers. See Hardware random number generator. -- JS, 29 Jan 2006

Use of NP-complete is wrong

The following paragraph seems wrong:

Consider a problem that has these four properties:

  1. The only way to solve it is to guess answers repeatedly and check them,
  2. There are n possible answers to check,
  3. Every possible answer takes the same amount of time to check, and
  4. There are no clues about which answers might be better: generating possibilities randomly is just as good as checking them in some special order.

... These types of problems are classified as NP-complete, and many examples that are algorithmically similar to encryption cracking can be seen at List of NP-complete problems.

To me this seems inaccurate. For example, graph isomorphism has these kinds of attributes but isn't believed to be NP-complete. Furthermore, I think that the most interesting question is whether BPP equals BQP or not. It is conjectured that factoring isn't in BPP, and this is the main reason for their power. sattath (talk)

Good catch. Changed! My edit may be a little heavy on the language. If you would, please look it over. Skippydo (talk) 07:52, 23 March 2008 (UTC)

breaking news - d wave quantum computer

D-Wave demonstrated their 16 qubit quantum computer today.

[3] for more info, specifically World’s First Commercial Quantum Computer Demonstrated

I think the way this information was included should be changed, or perhaps it should be removed. It sounds more like a press release than something in an encyclopedia article. Very Quiet 13:50, 16 February 2007 (UTC)

Randomised?

The article claims "Quantum computers only run randomized algorithms". I don't believe this to be a true statement. For example quantum error correction is completely deterministic in the sense that if the input is quantum data corrupted by noise within a certain threshold the output is fully determined. Sigfpe 21:33, 20 March 2007 (UTC)

Interesting; however, Random Algorithm + Error Correction = Random Algorithm. But to your point, a quantum computer can always run in classical mode and can therefore run non-random algorithms. Perhaps something to the effect of "All quantum algorithms are randomized algorithms". Still, I don't believe my suggestion addresses your concern. 125.14.79.216 16:28, 23 April 2007 (UTC)

How about "Quantum computers usually run randomized algorithms"? I gather there are deterministic quantum algorithms. User:Ben Standeven at 19:39, 9 May 2007 (UTC)

I went ahead and changed it to "Quantum computers only run probabilistic algorithms".125.14.79.216 03:04, 11 May 2007 (UTC)

I still don't believe this is true. For instance, the Deutsch-Jozsa algorithm returns a result with probability 1. Therefore, it is completely deterministic. This is perhaps a special case, as most other quantum algorithms require some element of randomness.Bjohnson00 04:26, 30 October 2007 (UTC)

All classical deterministic algorithms are classical probabilistic algorithms (P is a subset of BPP). The probability simply happens to be one. Skippydo 13:48, 30 October 2007 (UTC)

Question

I was just wondering which sense of the word "dimension" was used when talking about the set of states. It would be n if it's the vectorial dimension of the states, 2n−1 if talking about the number of free functional parameters available as coefficients (amplitude and phase, except the last one fixed by normalization), or 2^something if talking about set theory with something as a function of n. Where does 2^(n+1) − 2 come from? Thanks a lot.

  • See the section "Can quantum computers ever be practical for factoring?" in the archived talk page for a discussion of the dimensionality of the set of states. The short answer is that a quantum register is described by 2^n complex numbers. But the normalization constraint (together with the irrelevance of a global phase) means that we actually need only 2^n − 1 complex numbers. Or, if you want to talk about the number of real numbers needed, just multiply by 2 to get 2^(n+1) − 2. Bjohnson00 17:52, 29 May 2006 (UTC)
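For what it's worth, a trivial check of the counting in the answer above, nothing deeper than 2·(2^n − 1) = 2^(n+1) − 2:

    # Counting real parameters for an n-qubit pure state, as in the answer above:
    # 2^n complex amplitudes, minus one complex number's worth of constraints.
    for n in range(1, 6):
        complex_params = 2 ** n - 1
        real_params = 2 * complex_params
        assert real_params == 2 ** (n + 1) - 2
        print(n, "qubits ->", real_params, "real parameters")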

Request for references

Hi, I am working to encourage implementation of the goals of the Wikipedia:Verifiability policy. Part of that is to make sure articles cite their sources. This is particularly important for featured articles, since they are a prominent part of Wikipedia. Now this article has an extensive list of additional material, but a list of further reading is not the same thing as proper references. Further reading could list works about the topic that were never consulted by the page authors. If some of the works listed in that section were used to add or check material in the article, please list them in a references section instead. The Fact and Reference Check Project has more information. Thank you, and please leave me a message when a few references have been added to the article. - Taxman 19:18, Apr 22, 2005 (UTC)

  • I am planning to add some references to this article in the coming weeks, so expect to see a bunch of edits from me. After adding a first reference to an arXiv paper I noticed that I used a different citation style for arXiv than has previously been used in the Further reading section. Is that style the accepted standard for arXiv references on Wikipedia? --Bjohnson00 16:17, 10 May 2006 (UTC)

Boondoggle

I think that QC is a boondoggle. I say this with a heavy heart, as I was quite entranced by it as a postdoctoral researcher in Hans Mooij's group at Delft, working on single electronics circuits.

The theory is beautiful and elegant, and the possibilities are incredible-- and I mean "incredible" as in "not credible". In order to implement the kind of operations used in Shor's algorithm, you have to rotate the "qubit vector" by, for example, turning a signal on and off, such as the gate voltage employed by Nakamura et al. The problem is that you need to be able to rotate the vector to an accuracy of 2π/2^N, which translates to duty cycle times far below a nanosecond for solid-state systems (which are scalable) even at liquid helium temperatures.

No, you don't need to rotate to accuracy 2π/2^N. Errors in quantum gates ADD, therefore to achieve constant accuracy for a polynomial-sized algorithm you need accuracy which is basically one over the size of the algorithm. For Shor's algorithm, for example, to factor a number of size N, you need an accuracy of order 1/log^c(N), where c is around 2. Further, you never need in quantum computation to have a duty cycle (clock) which scales like this: you need only a universal set of quantum gates; "higher phase gates" are then made out of these gates. To understand why errors don't build up in such constructions, you then need to understand why fault-tolerant quantum error correction works.
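A small numerical illustration of the point just made, assuming for the sake of argument the 1/log^2(N) accuracy scale mentioned above (c = 2, all constants ignored):

    import math

    # Compare the 'naive' rotation accuracy 2*pi/2^N with the ~1/log^2(N) accuracy
    # that the fault-tolerant constructions described above are said to require.
    for n_bits in (128, 512, 2048):                  # size of the number being factored
        naive_log10 = math.log10(2 * math.pi) - n_bits * math.log10(2)
        needed = 1.0 / n_bits ** 2                   # log2(N) ~ n_bits, so 1/log^2(N) ~ 1/n_bits^2
        print(f"{n_bits:5d} bits: naive ~1e{naive_log10:.0f}  vs  required ~{needed:.1e}")
    # The required accuracy shrinks only polynomially with the input size,
    # not exponentially, which is the substance of the rebuttal above.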

I would love to be proven wrong, but after being in the trenches for a while and an observer since then, I've lost faith in QC. Keeping 10^3 - 10^4 qubits coherent in any sort of man-made environment seems irreproducible at best.

Austin Fowler showed that Shor's algorithm still works if you skip the smallest rotations; see [4] or [5]. -- Tim Starling 06:53, 4 April 2006 (UTC)


Sounds a lot like the situation with nuclear fusion power. Lots of promise, no theoretical reason it can't work, lots of smart physicists devoting their careers to it... yet it's somehow always 20 years in the future.

Yes, except that the payoff for society is vastly lower. Nuclear fusion would provide a clean, cheap, renewable energy source. Quantum computation would provide the ability for particularly wealthy governments to snoop on the communications of citizens. Most of the money comes from defence, so other applications haven't been seriously developed. Some have suggested that quantum computation would provide gains for computational chemistry and related fields, but such applications are certainly far more complicated than factorisation. -- Tim Starling 11:24, 12 April 2006 (UTC)
Actually I think that gains in computational chemistry are a lot less complicated than factorization. For example, recent work by Martin Head-Gordon has shown that quantum computers with around fifty qubits will beat today's classical algorithms (as opposed to the thousands needed for breaking today's codes). Dave Bacon 20:55, 28 April 2006 (UTC)
This isn't a forum, people. This has nothing to do with the article. Whether it is a "boondoggle" or not, it exists in some form and has an article. Please keep this discussion on how the article should be edited and keep opinions out. —Preceding unsigned comment added by 128.227.98.88 (talk) 16:37, 16 October 2007 (UTC)

Bias towards NMR

This article seems to have an unfair bias towards the NMR implementation (simply in the amount it covers and refers to it, and even the first picture at the top), while there is as yet no separate page on NMR quantum computing. Would it be possible to move the NMR info into its own article without detracting from the content of this one? I appreciate it would be harder to explain some of the concepts without a particular implementation in mind...

And yes, it's obviously an interdisciplinary field, so it's impossible to separate everything out nicely. Tomatoman 20:12, 6 January 2006 (UTC)

I agree that this article focuses too heavily on NMR implementations. In particular, the section on "initialization, execution, and termination" includes a discussion of ensembles which is only valid for NMR, and so should be moved into an article just on NMR quantum computing.--Bjohnson00 23:08, 6 March 2006 (UTC)

Quantum Computers work while without power?

[6] Contemplating exactly what this link entails, I'm somewhat in the dark. I am assuming, however, that the architecture of the quantum computer is such that its program works at the photon level without electrons? Meaning, they aren't bombarded by the energy current from a wire, and this is what is meant by "switched off", but on the quantum level natural photon movements interact with the architecture/program without the need for conventional energy? Or is this purely a software phenomenon (but even then, "running" at the proxy software level requires a movement of electric current, i.e. electrons, correct?). My hope is that someone who knows more than me may be able to add the "working while off" phenomenon of quantum computers, as per the link, to the article here. Nagelfar 22:47, 24 February 2006 (UTC)

In case you want a quicker answer, post your question at Wikipedia:Reference Desk/Science

It's my understanding that binary computers can have two states in their transistors: on or off, while quantum computers have 32 different states. Isn't this a better way of describing it rather than have all those numbers?

It's always wise to view these kinds of results with skepticism, especially when they have passed through the hands of New Scientist, which aims to spin scientific results to sound as weird as possible, taking them to the limits of plausibility and often beyond. The Nature article makes a bit more sense, although the full text is not freely available.
The procedure for this kind of experiment could be more matter-of-factly described as follows:
  1. Run the algorithm on the input data
  2. Simultaneously, pass the input data through without running the algorithm
  3. Interfere the results of 1 and 2 in quantum fashion
  4. Contrive your measurement so that your detectors only pick up something if the result of the algorithm has some particular value and the particle being measured came through the non-running arm of the experiment. Particles which went through any other path are discarded without being measured.
Despite the fact that the particle being measured didn't have the algorithm run on it, the running of the algorithm is absolutely necessary to obtaining the correct result. If you switched off the power to the algorithm unit (if it requires power) or removed it, the device would immediately stop working. This is because particles from the "running" part of the device interfere with the path of particles from the "non-running" part of the device. That this is possible is easier to understand from the point of view of wave propagation rather than the Copenhagen interpretation, so predictably, the popular press chooses the latter. Under the Copenhagen interpretation, one might claim with a straight face that the particle from the running half of the experiment never existed, and that the particle from the non-running half was perturbed by some kind of magical knowledge of the workings of a distant computer. This, I would argue, is misleading. -- Tim Starling 02:55, 17 April 2006 (UTC)


Merge request from Quantum algorithm

This is the entry:

"A quantum algorithm is an algorithm designed for use on a quantum computer." (and some links) Ripper234 20:49, 2 April 2006 (UTC)

Agree. Dictionary definition doesn't warrant its own page at this time. --Officiallyover 04:36, 3 April 2006 (UTC)
Moved. Ripper234 15:44, 3 April 2006 (UTC)

Superior to classical computers?

There is one other problem where quantum computers have a smaller, though significant (quadratic) advantage. It is quantum database search, and can be solved by Grover's algorithm. In this case the advantage is provable. This establishes beyond doubt that (ideal) quantum computers are superior to classical computers.

Doesn't this just establish that ideal quantum computers are superior to classical computers for database search, and not in general? Or am I missing something?—Matthew0028 17:56, 4 May 2006 (UTC)

It's true that not every classical algorithm has a (known) "accelerated" quantum equivalent. However, since a QC could always be run in classical mode, it's never slower than a classical machine. There are other problems which are so slow for a classical machine as to be impossible altogether, such as factoring. But thanks to Shor's factoring algorithm, factoring large numbers is entirely feasible on a quantum one. Finally, you shouldn't knock database search - the problem is extraordinarily general. The basic techniques can be applied to search-space problems, meaning that classical problems that can only be solved by brute force are accelerated on a QC.

So the basic point is, "A QC is frequently faster, and never slower."--Hyandat 03:31, 5 May 2006 (UTC)
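As a concrete, deliberately crude illustration of the 'frequently faster, never slower' point, assuming an unstructured search over a k-bit key space and ignoring all constant factors, clock speeds, and error-correction overhead:

    import math

    # Brute force needs ~2^k guesses; Grover's algorithm needs ~sqrt(2^k) = 2^(k/2)
    # oracle queries. This is the quadratic speedup, nothing more.
    for k in (40, 80, 128):
        classical = 2 ** k
        grover = math.isqrt(2 ** k)   # ~2^(k/2)
        print(f"{k:3d}-bit key: classical ~2^{k}, Grover ~2^{k // 2} "
              f"({classical:.2e} vs {grover:.2e} queries)")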

I think that this article is clear on what a QC can do better than classical computers, but it never explains WHY it is better... what makes it better than classical computers? How is it that these algorithms can be run on a QC and not a classical one? How about its ability to do parallel computations? Or the significant gain in using resources like memory? Janechii 11:31, 8 May 2006 (UTC)

How does Shor's algorithm work?

How does Shor's algorithm work? Our article on this subject currently contains the view of David Deutsch, who happens to view it as proof of the many-worlds interpretation of quantum mechanics. Are there any other interpretations of how it physically works, by physicists (as opposed to internet amateurs)? If so, these other POVs should be added to the Shor's algorithm article. RK 20:30, 11 June 2006 (UTC)

http://wiki.riteme.site/wiki/Shor%27s_algorithm#How_does_Shor.27s_algorithm_work.3F

Maybe the section should be called "Why Shor's algorithm works", since how it works is described in that article. Even the question "why" is really equivalent to "Why does quantum mechanics work?", since no one who believes in quantum mechanics can deny Shor's algorithm - it follows logically and provably. I'm not sure anyone today can rightly answer "Why QM?". That's just the way things work. If you wonder what aspect of quantum mechanics makes Shor's algorithm faster than any possible classical algorithm, I think the right answer is entanglement. Turing machines are hopelessly local, and so can never properly do entanglement, which is nonlocal. In terms not unfamiliar to computer scientists, quantum computers can delay the resolution of a complex sequence of contingencies (by maintaining a superposition) which they can later resolve instantaneously, with no time overhead, upon a measurement (defying locality). Turing machines, being local, will always require at least some time to resolve contingencies, since locality requires there be communication between the various bits (as in the Greenberger-Horne-Zeilinger game).

To the best of my knowledge, this is the only thing a quantum computer can really do that a Turing machine is incapable of simulating. In fact, the Gottesman-Knill theorem shows that a quantum algorithm that doesn't make enough use of entanglement can be simulated efficiently. I know I'm not alone in my view that entanglement is the key to this mystery. The explanation as posted under Shor's algorithm seems strange to me, because it seems to suggest we can borrow computing power from other worlds. But in that case, Shor's algorithm would only be accelerated in the world doing the borrowing, so that from our point of view it would only be faster some of the time.--Hyandat 21:34, 11 June 2006 (UTC)

But wouldn't the various worlds essentially be pooling their computing power for Shor's algorithm? All the worlds are trying to factorize the same number, so each world does its own bit of work and the results interact with all the other worlds, with information flowing both ways. Hence all the worlds would experience accelerated computing. Tomgreeny 15:30, 22 March 2007 (UTC)

Not _all_ of the worlds; just _most_ of them. If you look at things the David Deutsch way, the error rates are telling you the fraction of worlds that don't get the right answer. --76.216.10.112 (talk) 09:26, 1 March 2009 (UTC)

Simplification required

Even in the introduction, this article lacks a simple explanation for the novice. I suggest a very simple explanation at the top, that anyone could understand, then progressing deeper and deeper into the subject in subparagraphs. Landroo 14:48, 8 July 2006 (UTC)

I agree; a sort of example or very layman's-terms explanation of what a quantum computer is, and what it is capable of doing over a present-day computer, would help. Eridani 0123, 12 January 2007 (EST)


I am not at all convinced that either issue can be further simplified. A QC exploits the unique features of quantum mechanics, so you cannot really understand what a QC is without understanding what QM is, and that is a difficult subject. What a present-day QC can do over a present-day conventional computer is not a lot, but that is a misleading answer. The interest in QC lies in what it could do if developed sufficiently. 1Z 15:09, 12 January 2007 (UTC)

DNA computing - better to omit it?

I don't quite understand why DNA-computers are lumped into the introduction as "classical" computers. At least to my knowledge, DNA computers are anything but classical (e.g., highly parallel and probabilistic and all that). If one really wants to mention DNA-computers, a far more elaborate explanation would be necessary; the way it is phrased now, it's misleading and highly imprecise. I suggest not to mention them at all.

DNA computers are classical in the sense that the word "classical" is being used here. "Classical" is used in opposition to the word "quantum" (common usage among physicists) and DNA computers are classical in this sense. Additionally, there are parallels between DNA computing and NMR-spectroscopy-based "ensemble quantum computing" (e.g. see [7]). Sigfpe 22:59, 20 March 2007 (UTC)

"Asymptotically"?

It is widely believed that if large-scale quantum computers can be built, they will be able to solve certain problems asymptotically faster than any classical computer.

I don't understand the usage of "asymptotically" in this sentence. Is the sentence claiming that the speed of classical computers can approach but never reach the speed of quantum computers at solving certain problems? If this is so, then the use of "asymptotically" is simply incorrect, because a curve can in fact possibly reach its asymptote and even cross it. I think that what is really meant is something like "incomparably".--Susurrus 04:43, 27 November 2006 (UTC)

For some problems there exists a quantum algorithm asymptotically faster than any possible classical algorithm, e.g. O(n^(1/2)) quantum vs O(n) classical for search. This sentence is correct but imprecise: "widely believed" should be replaced with "a fact". --kaz

Susurrus' question was not addressed. The sentence says "asymptotically faster" when it should read "faster asymptotically," as kaz said but did not emphasize. I don't think Susurrus understands asymptotic analysis. —Preceding unsigned comment added by 131.118.72.137 (talk) 06:26, 15 November 2007 (UTC)

NP problems

'There is a common misconception that quantum computers can solve NP-complete problems in polynomial time.' I thought they could, can anyone explain this one? Paskari 20:09, 2 December 2006 (UTC)

It hasn't been proven one way or the other. What's there to explain? --kaz

Generally, the assumption is that an NP-complete problem can't be solved by any means other than trying every possible combination; thus it will require fully-exponential time on a classical machine.

Now, the problems which can be easily solved on quantum computers also have classical algorithms that run in less than fully exponential time (i.e. they use time 2^sqrt(n) or the like), so they probably aren't NP-complete. So we expect that quantum computers also require fully exponential (or at least more than polynomial) time on NP-complete problems. Grover's algorithm is probably the only one that provides any speedup for a complete problem, and it is only quadratic, so it turns time 4^n into time 2^n, which is not really useful. Ben Standeven 05:27, 29 January 2007 (UTC)

A quadratic improvement on 2^n is 2^(n/2), not 2^sqrt(n), and is called "subexponential". 24.77.65.236 23:53, 25 February 2007 (UTC)
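A one-line sanity check of the correction above (purely arithmetic, nothing quantum about it):

    import math
    n = 40
    print(math.sqrt(2 ** n) == 2 ** (n / 2))   # True: sqrt(2^n) = 2^(n/2), not 2^sqrt(n)
    print(2 ** math.sqrt(n))                   # ~80, a vastly smaller number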

This article Rocks

It is not an easy article, and I could see value in some "dumbed down" introduction to make it more accessible to people who are not well-versed in quantum mechanics or computer science.

I am a lowly biologist/biochemist who dabbles in computer programming and materials, and I find this article to be very challenging. But I like that. This article gives me lots of food for thought and points me in lots of good directions. I am very grateful to the editors of this article. And I am impressed by the civility demonstrated here in the Talk section.

Cyclopiano 09:43, 9 December 2006 (UTC)

Quantum Computers Now Exist, Update Article

I am unable to update the article, but if this newspaper is correct, a small Canadian company has succeeded in making a quantum computer.

Link to article: [8]

Sir On The Edge 23:04, 13 February 2007 (UTC)

Experimental QCs have existed for some time.1Z 23:47, 13 February 2007 (UTC)

The word commercial is all-important here.1Z 01:38, 14 February 2007 (UTC)

D-Wave has stated that currently the only thing their quantum computer has over classical computers is the word "quantum" in its name. Anyone want to buy a quantum bridge? 24.77.65.236 23:57, 25 February 2007 (UTC)

List of types of computers incomplete ?

How about putting a short list here pointing to other kinds of computers? In the text there already is a list. What should be added to it is neural computers. Perhaps concepts could be added too, like distributed computing.

I don't mean the neural math, but the rat brain cells that can fly an airplane. Perhaps it's a sad thing they made rat brains do that, but these days they create brain cells like other cell types, and they can be attached to electronics.

It might be a good idea to also provide some information on what types of problems each computer type is good at. Based on that, it can then be pointed out where a quantum computer fits in.

I think the article could be made more complete by adding more types of problems a quantum computer can solve, or what kind of mathematical direction is used here. —The preceding unsigned comment was added by 82.217.143.153 (talk) 23:07, 14 February 2007 (UTC).

What is different between Shor's factorization algorithm and Grover's algorithm?

Grover's algorithm is for an unstructured search. Shor's algorithm is for factoring an integer (rather, order finding, which is the heavy lifting in factoring). They answer different questions and cannot be compared. Shor's gives (log N)^3 running time. I assume by the quadratically better (log N)^1.5 you suggest that you wish to apply Grover's on top of Shor's, which cannot be done since Grover's requires an oracle. 125.14.79.216 16:44, 23 April 2007 (UTC)

Wrong. Shor's algorithm needs O(n^3) time, while Grover's algorithm needs O(2^(n/2)) time, where n is the number of qubits. —Preceding unsigned comment added by Weekwhom (talkcontribs) 05:22, 10 June 2008 (UTC)
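For readers trying to follow this exchange, here is a rough comparison of the two scalings being argued about. The expressions, O(n^3) operations for Shor and O(2^(n/2)) queries for a Grover-style factor search, are the usual textbook figures, with every constant factor ignored:

    # Very rough operation counts for factoring an n-bit number:
    #   Shor's algorithm      ~ n^3 quantum operations
    #   Grover-style search   ~ 2^(n/2) oracle queries (the smallest factor is < 2^(n/2))
    for n in (64, 256, 1024):
        shor = n ** 3
        grover = 2 ** (n // 2)
        print(f"n={n:4d}: Shor ~{shor:.1e}, Grover-style search ~{grover:.1e}")
    # Shor scales polynomially with n; any Grover-based factoring search does not.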

Power of quantum computers

This comment is in response to the request for help with the Featured Article review:

The section Power of quantum computers is fairly convoluted and is asking for a clean-up. The main problem seems to be the lack of focus (or structure): it wanders into certain problems with known efficient quantum algorithms, then wanders away, comes back... The "author" cannot decide whether to hail quantum computers for being able to break crypto-algorithms or to reassure the reader that not all is lost. I think that the best way to deal with it would be to adopt the traditional order of exposition:

1. Explain Feynman's idea (the difficulty of quantum simulation on classical computers).

2. Give an informal explanation of Deutsch's problem and Grover's algorithm.

3. Give a very brief but clean explanation of the difficulty of integer factorization and discrete logs for classical computers; then explain Shor's breakthrough. Avoid vague statements, like the one about 300 digit numbers unless they can be backed up in the text (seems unreasonable).

4. Summarize the current gap (with the disclaimer) between the power of classical and quantum computation. The implications for cybersecurity can be mentioned there, but without repetitions and too many qualifiers. Complexity-theoretic statements need to be cleaner. If there are disagreements about inclusions between various complexity classes, this article is not the place to settle them.

5. Defer all discussion of 'why this works?' type to the corresponding articles or a separate section.

I hope this helps! Arcfrk 08:17, 10 April 2007 (UTC)

I agree with a lot of the above. This line is particularly a problem: "It seems plausible that it will always be possible to build classical computers that have more bits than the number of qubits in the largest quantum computer." How so? I understand the premise of the statement, but why can't quantum computers advance exponentially in relation to traditional machines? They certainly could. There's no reference or explanation. That statement is an example of why that section is so convoluted. 128.227.98.88 16:45, 16 October 2007 (UTC)

Please move anything back into the article you feel should not have been removed or place your comments on this action here. Skippydo 10:26, 27 July 2007 (UTC)

Reference misplaced

Ref 9 in the section "Quantum decoherence" is misplaced. Ref 9 does not discuss the increase of the number of qubits needed for fault tolerance. Rather, it questions the very possibility of fault tolerant quantum computation and presents a challenge for numerical verification of the proposed procedure for storing in memory just one qubit in the presence of realistic noise. (Karamzine 10:24, 10 August 2007 (UTC))

Section 6 on page 4 of the article you refer to makes reference to the Shor code and Steane code. Specifically, it mentions that 7 physical qubits are required for each logical qubit. It might be better to reference the Shor and/or Steane papers directly. Skippydo 13:50, 10 August 2007 (UTC)

Speed on the physical level

Okay, I understand the performance differences between quantum and classical computers on the theoretical level (Big-O notation): quantum computers probably have an advantage in integer factorization and possibly other problems as well. But what about the "actual" manipulation of data? Can quantum computers (or "can future quantum computers") actually flip and store (qu)bits at a much faster rate than classical computers? It seems to me like this should be the case, but I don't know. And if it is, wouldn't it not matter if quantum computers don't have an advantage over classical ones on the algorithm level, because they could emulate classical algorithms at a higher rate of speed? Or do I just have no idea what I'm talking about? Sloverlord 14:11, 29 August 2007 (UTC)

It depends on what you mean by faster. If you're talking asymptotically, then the answer is no.
If, by faster, you mean ticks of the clock, then do you have specific implementations in mind? Transistor-based classical computation vs ion-trap quantum computing? Or vacuum-tube classical vs NMR quantum? What version of each processor? Skippydo 20:59, 31 August 2007 (UTC)
Are the various implementations so different in their capabilities that some would be faster than classical computers and others slower? Sloverlord 17:28, 6 September 2007 (UTC)

Probabilistic?

It is not true that quantum computers just solve probabilistically... they are actually deterministic. —Preceding unsigned comment added by 83.132.216.92 (talk) 00:24, 6 November 2007 (UTC)

pictures

any pictures of a real quantum computer?--200.73.179.203 03:29, 15 September 2007 (UTC)

QCs in fiction

Zegapain should be mentioned in the QCs in fiction section --David13579 23:32, 6 November 2007 (UTC)

Also, is there any point in the following line: "In Dan Brown's novel Digital Fortress, the NSA's cypher breaking computer TRANSLTR does NOT use quantum computing to break encryption keys of encrypted E-mails, it uses 3,000,000 silicon processors."? If it doesn't use quantum computing, why is it listed here? Calmac1991 (talk) 23:27, 27 March 2008 (UTC)

Can any of this be explained in layperson's terms?

I know this is probably asking the impossible, but can the general gist of this article be made accessible to someone without university level physics? Analogies might help. The example that comes to mind of such an analogy is that of heavy ball bearings sitting on a horizontal suspended rubber sheet to illustrate the curvature of space.

A walk-through example in very simple language might be good as well, for instance: What goes into a particular quantum logic gate, what happens there and what comes out of it?

As it is, I still haven't got the slightest clue why a set of 8 qubits should hold so much more information than 8 normal bits.

I hope this request doesn't seem thoughtless or rude. I am well aware of how difficult it is to make quantum physics intelligible to anyone, let alone non-physicists. Ireneshusband (talk) 22:07, 23 January 2008 (UTC)

You will not find any analogies to the world you think you live in. This cannot be explained using layman's terms. Anyone who claims to understand "what is really going on" does not.

On your specific questions, what "goes in" to a quantum logic gate depends on the implementation. A laser acting on a photon is an example of a quantum gate (not really gate like, is it?) 8 qubits cannot "hold" any more information than 8 classical bits. I'm assuming that "hold" means we can measure and recover the bits. Some dispute this meaning.

You aren't being thoughtless and rude and I hope I haven't been either. If you aren't disturbed and a little offended by quantum physics you haven't been paying attention. Skippydo (talk) 03:33, 29 January 2008 (UTC)
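A small numerical sketch of the distinction drawn above: describing the state of 8 qubits takes 2^8 complex amplitudes, but a single measurement still yields only 8 classical bits. This illustrates description size only, not "stored" information in the sense discussed above.

    import numpy as np

    n = 8
    # A general pure state of n qubits is a normalized vector of 2^n complex amplitudes.
    rng = np.random.default_rng(0)
    state = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
    state /= np.linalg.norm(state)
    print("complex amplitudes needed to describe the state:", state.size)   # 256

    # A measurement in the computational basis collapses all of that to n bits:
    probs = np.abs(state) ** 2
    outcome = rng.choice(2 ** n, p=probs)
    print("bits actually obtained from one measurement:", format(int(outcome), f"0{n}b"))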

Well, I don't know that it's impossible to explain things in simpler language, but it's a challenge. It's especially difficult because there's a pretty technical debate going on between editors of the page. Didn't Einstein say something along the lines of "if you can't explain it to a child, you don't understand it"? Perhaps all physicists have failed Einstein's test on the matter of quantum mechanics. But we can always try to do better. Gnixon (talk) 16:50, 4 June 2008 (UTC)

Quantum computing without entanglement

http://arxiv.org/pdf/quant-ph/0511272v1

"Some cases have been found where even without entanglement, quantum computa- tion outperforms classical computation. Collins, Kim and Holton [15] solve the Deutsch- Jozsa (DJ) [5] problem without entanglement, but only for n = 2 bits, and prove that entanglement is required for any larger n; Braunstein and Pati [16] show that using pseudo-pure states, Grover’s search problem can be solved without entanglement for n ≤ 3 bits more efficiently than classically; Lloyd [17] suggests an entanglement-free implementation of Grover’s algorithm, but with exponential spatial complexity; Biham, Brassard, Kenigsberg and Mor [18] use a non-standard computation model, with a limitation on the number of allowed queries, to prove a tiny separation for any n in the context of Deutsch-Jozsa’s and Simon’s [6] problems (with mixed states); Meyer [19] notes that Bernstein-Vazirani algorithm [20, 21] requires no entanglement, yet uses only one oracle call, while the best classical algorithm requires n oracle calls."

The Deutsch-Jozsa and Simon's algorithms have no practical interest, and the Bernstein-Vazirani algorithm probably doesn't either. But where does it say anything about a quantum computer? BTW, the Deutsch-Jozsa problem can be solved efficiently on a probabilistic computer. Simon's algorithm has considerably less than an exponential advantage over a classical computer. So if entanglement decreases exponentially with the number of qubits, then it's possible to conclude that Simon's algorithm without entanglement also has no advantage over a classical or probabilistic computer! Also, a key point may be that any quantum computation without entanglement needs a classical computer, and the quantum computer usually takes more gates... —Preceding unsigned comment added by Weekwhom (talkcontribs) 08:17, 3 June 2008 (UTC)

The recursive Bernstein-Vazirani problem needs more queries classically than quantum mechanically (with entanglement). The non-recursive Bernstein-Vazirani problem needs n queries classically and 1 query quantum mechanically (with entanglement). It's possible that the Bernstein-Vazirani algorithm without entanglement isn't faster than a probabilistic computer.

"We investigated the advantage of quantum algorithms without entanglement over clas- sical algorithms, and showed a maximal entanglement-free subproblem of the Deutsch- Jozsa problem, which yields an O(1) to O(n) quantum advantage over the best exact classical algorithm. Due to this ban on entanglement, the exponential advantage of exact-quantum versus exact-classical is lost. For the Simon problem we showed that any non-trivial subproblem requires entanglement during the computation. Using a some- what different approach, we showed that this also holds for Grover’s algorithm. Further research on the role of entanglement in quantum information processing may illuminate some of the following questions: is there a restricted form of the Simon problem (or more generally of the hidden subgroup problem [22]), and a corresponding quantum algorithm that presents a quantum advantage without entanglement? Is there a subprob- lem larger than DJ⊗ and a corresponding algorithm (not the DJ algorithm) that solves it without entanglement, and yet has an advantage over any classical algorithm? Can there be a non-negligible advantage of QCWE over classical computation when separable mixed states are used? Can it be proved that exponential advantage of exact QCWE over classical-exact computation is impossible in oracle-based settings? Note that this is not known yet, even for the Deutsch-Jozsa problem, since there may be some other quantum procedure for which the QCWE advantage holds, and the subset is larger than F⊗DJ."

- It's unclear for Simon's algorithm whether there is any advantage without entanglement.


However, a sharp criticism has been made by Braunstein et al. that the NMR experiments have not actually realized a quantum algorithm, because at each time step the state of the system can be described as a probabilistic ensemble of unentangled quantum states. On the other hand, some scientists believe that for a specific quantum algorithm the power of a quantum computer derives from quantum superposition and parallelism rather than entanglement.

Quantum parallelism, I think, here means the randomly entangled qubits.
Strong doubt about a faster-than-classical Bernstein-Vazirani algorithm without entanglement:

"We have described a classical optical scheme to implement the Bernstein-Vazirani algorithm. This scheme is entirely classical as we have used only ’classical qubits’ (based on the polarization states of light beams), and passive optical elements such as detectors, beam splitters, phase shifters and mirrors. The number of components needed to implement the algorithm increases linearly with the number of input beams. We have explicitly cloned the input and interfered it again with the part which undergoes the oracle unitary, in order to solve the Bernstein-Vazirani problem. This scheme does not require the implementation of any Hadamard gates. We have also shown through our interference arrangement that we can use the same oracle to compute f(x) for a given x. We believe that this analysis is a step in the direction where information processors based on interference of waves are analyzed in detail for their computation power. These systems seem to provide a model that is in-between the classical computation model based on bits and a fully quantum computer. The computational power is also likely to be in-between the two models (these issues will be discussed elsewhere)." http://arxiv.org/pdf/quant-ph/0605092

So where are the claims about faster quantum computation without entanglement? Where is the proof?


Simon's algorithm needs entanglement:

We conclude that the usage of the Simon algorithm for any subproblem of the Simon problem requires entanglement. However, an interesting open question in this context, is whether there is a restricted version of the problem solvable by some other quantum algorithm without entanglement, and achieves an advantage over the classical case. —Preceding unsigned comment added by Weekwhom (talkcontribs) 10:36, 3 June 2008 (UTC)

You yourself quote the following: a maximal entanglement-free subproblem of the Deutsch-Jozsa problem, which yields an O(1) to O(n) quantum advantage over the best exact classical algorithm.
Here's another: The quantum advantage that we have found is negligible (exponentially small). [9]
I assume this is in response to this piece of the article:

Some computing architectures such as optical computers [3] may use classical superposition of electromagnetic waves. Without some specifically quantum mechanical resources such as entanglement, they have been shown to obtain no advantage over classical computers; entanglement is necessary for the exponential (or quadratic) advantage theoretically obtainable by a quantum computer.

Would you rather:

Some computing architectures such as optical computers [3] may use classical superposition of electromagnetic waves. Without some specifically quantum mechanical resources such as entanglement, they have been shown to obtain only a negligible advantage over classical computers in a few cases, not the exponential (or quadratic) advantage theoretically obtainable by true quantum computers.

Skippydo (talk) 13:12, 3 June 2008 (UTC)


The quantum advantage that we have found is negligible (exponentially small). This is true when comparing a deterministic computer with a quantum computer without entanglement, but a probabilistic computer is as fast as or faster than a quantum computer without entanglement on the Deutsch-Jozsa problem. So your assumption is bad. P.S. Even a probabilistic computer is faster than a quantum computer with entanglement, because of the error correction needed in the quantum computer... —Preceding unsigned comment added by Weekwhom (talkcontribs) 15:01, 3 June 2008 (UTC)


About Grover's algorithm without entanglement (and even without mixed entanglement): It is well known that this unary mapping comes at the cost of an exponential overhead in some physical resource. —Preceding unsigned comment added by Weekwhom (talkcontribs) 15:03, 3 June 2008 (UTC)

On your first comment, now, I'm not sure what you mean by faster. If you wish to compare actual runtime in seconds, you are comparing apples and oranges. When we compare algorithms, we generally speak asymptotically, in which case the quantum implementation enjoys a negligible advantage: Using a separable (that is, unentangled) state, we show that the Deutsch–Jozsa problem and the Simon problem can be solved more reliably by a quantum computer than by the best possible classical algorithm, even probabilistic. [10] Skippydo (talk) 15:21, 3 June 2008 (UTC)

But as the number of qubits in a quantum computer without entanglement increases, the advantage decreases exponentially. And probably:

  • "The quantum advantage that we have found is negligible (exponentially small).

A much better advantage might be obtained by increasing � and investigating the separability of the specific states obtained throughout the unitary evolution of the algorithms.

  • The case of more than one query is left for future research." [11]
It is only for one query. Don't you think that is rather unimportant? And I think it's the scientists who are comparing apples with oranges, because they don't count the number of gates, etc., and a quantum computer must be controlled by a classical computer! Weekwhom


On your first point, you seem to be agreeing that quantum without entanglement is better than classical.
On your second point, I'm not sure what more than one query is meant to mean. We ask if a quantum computer enjoys an advantage over classical. The answer is yes. Do you not agree?
The mathematical formulation of a quantum computer is a Hilbert space and unitary transformations. There is no mention of classical computers. Likely, in an implementation some are needed, but this is independent of the results.
Regardless, a quantum computer (with or without entanglement) can always run in classical mode. This means that a quantum computer is never inferior to a classical computer. Skippydo (talk) 15:18, 4 June 2008 (UTC)

After one query in the Deutsch-Jozsa algorithm they both give a not-so-good answer... But a quantum computer without entanglement has a slightly higher probability of giving the correct answer than a probabilistic computer, I guess... But it wasn't tested experimentally, I think. And there may still be a need for more operations on a quantum computer without entanglement than on a probabilistic computer.

For a probabilistic computer you don't need such schemes, because the answer will still be probabilistic, so a probabilistic computer needs fewer gates than a quantum computer without entanglement, and I think this can explain the misunderstanding that a quantum computer without entanglement is faster than a probabilistic computer (on the Deutsch-Jozsa problem). —Preceding unsigned comment added by Weekwhom (talkcontribs) 05:33, 5 June 2008 (UTC)

The Deutsch-Jozsa algorithm is deterministic.
I don't understand what you mean by a probabilistic computer needs fewer gates than a quantum computer without entanglement. Consider a probabilistic classical algorithm and the Deutsch-Jozsa algorithm both being run on a quantum computer without entanglement. Asymptotically, the Deutsch-Jozsa algorithm wins. Skippydo (talk) 15:30, 5 June 2008 (UTC)
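Since the determinism of Deutsch-Jozsa keeps coming up, here is a minimal classical simulation of the standard (entangled) textbook algorithm for small n, purely as a sketch of why a single query distinguishes constant from balanced with probability 1. It says nothing about the entanglement-free variants being debated here.

    import numpy as np

    def hadamard(n):
        # n-fold tensor product of the single-qubit Hadamard gate
        H1 = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
        H = np.array([[1.0]])
        for _ in range(n):
            H = np.kron(H, H1)
        return H

    def oracle(f, n):
        # U_f |x>|y> = |x>|y XOR f(x)>, with x the top n qubits and y the bottom one
        dim = 2 ** (n + 1)
        U = np.zeros((dim, dim))
        for x in range(2 ** n):
            for y in (0, 1):
                U[(x << 1) | (y ^ f(x)), (x << 1) | y] = 1.0
        return U

    def deutsch_jozsa(f, n):
        state = np.zeros(2 ** (n + 1))
        state[1] = 1.0                                   # start in |0...0>|1>
        state = hadamard(n + 1) @ state                  # Hadamard everything
        state = oracle(f, n) @ state                     # one oracle query
        state = np.kron(hadamard(n), np.eye(2)) @ state  # Hadamard the first n qubits
        p_all_zero = np.sum(np.abs(state[:2]) ** 2)      # P(first n qubits all read 0)
        return "constant" if np.isclose(p_all_zero, 1.0) else "balanced"

    print(deutsch_jozsa(lambda x: 0, 3))                      # constant
    print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))  # balanced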

Consider a probabilistic classical algorithm and the Deutsch-Jozsa algorithm both being run on a quantum computer without entanglement. Asymptotically, the Deutsch-Jozsa algorithm wins. With more gates. Don't you see? The Deutsch-Jozsa algorithm works with a function, which takes some number of gates, while a probabilistic computer doesn't need to care about any function; it just randomly generates some bits and those bits are the answer... or maybe not... Sorry, I missed something... But the advantage of the quantum computer appears only after one query, so it doesn't matter much... And, for example, entanglement works well only with photons, and photons take up a lot of space... It is also possible that the Deutsch-Jozsa algorithm requires more operations than a probabilistic computer solving the same problem. I am only familiar with the Deutsch-Jozsa algorithm with entanglement, and can tell you that the Deutsch-Jozsa algorithm with entanglement isn't faster than a classical computer on the same problem. Why? There is a very strong argument for this. First of all, the classical computer doesn't use additional Hadamard gates! Also, the classical computer doesn't need error correction, which increases the number of gates up to 1000 times (according to some optimistic theoretical expectations about the possibility of quantum error correction). A probabilistic computer can give you the correct answer after n queries with a failure probability that shrinks exponentially, while the quantum algorithm needs only 1 query (plus more gates for error correction and Hadamard gates) with certainty. But what does 100% mean? It would mean the error correction has no failure probability, but it actually has some small theoretical failure probability; thus after roughly 15 tries a probabilistic computer will solve the Deutsch-Jozsa problem as successfully as a quantum computer with entanglement. After two queries of the quantum computer the error-correction failure probability doubles, so about 30 queries of the probabilistic computer give the same probability. But the quantum computer circuit is more than 15 times as complex, and thus everything is equal and the probabilistic computer even does better. Thus the Deutsch-Jozsa algorithm with entanglement and error correction isn't faster than a probabilistic computer. So I am now searching for the weak point of the Deutsch-Jozsa algorithm without entanglement compared with a probabilistic computer, but since I don't understand the Deutsch algorithm without entanglement, that is up to you... But I think the Deutsch-Jozsa circuit on a quantum computer without entanglement may be more complex than the probabilistic computer circuit, and then the exponentially small advantage doesn't play any role and they are equal. And by the way, the Deutsch-Jozsa algorithm has no practical applications... —Preceding unsigned comment added by Weekwhom (talkcontribs) 18:04, 5 June 2008 (UTC)

Gate complexity is polynomial (in fact linear) for the DJ algorithm. Gate complexity doesn't affect speed. In the mathematical model of quantum computing we are assuming, no error correction is required. Skippydo (talk) 18:47, 5 June 2008 (UTC)
But it requires more gates, because the Deutsch-Jozsa algorithm with entanglement uses about 2n Hadamard gates, where n is the number of qubits, and the overall size of the circuit is otherwise the same. A Hadamard gate is a pretty difficult operation, I guess, so it is possible that even in the mathematical model the Deutsch-Jozsa algorithm isn't faster than a probabilistic computer (because everything is the same, except that the quantum computer uses an additional 2n Hadamard gates). Scientists seem not to count Hadamard gates in the quantum computer's complexity, or, if I am wrong, then Hadamard gates are a very simple and negligible operation which can be ignored - I am not sure about that... But I can say that, roughly, the Deutsch-Jozsa algorithm needs fewer than n/2 CNOT gates and fewer than n/2 NOT gates. But again, a quantum CNOT gate can be more complex than a classical XOR gate, and a quantum NOT gate can be more complex than a classical NOT gate. It may be that scientists are comparing apples with oranges and then getting their polynomial speedup over a probabilistic computer and exponential speedup over a classical computer (but not always...). —Preceding unsigned comment added by Weekwhom (talkcontribs) 05:40, 6 June 2008 (UTC)


Some cases have been found where even without entanglement, quantum computation outperforms classical computation. Collins, Kim and Holton solve the Deutsch-Jozsa (DJ) problem without entanglement, but only for n = 2 bits, and prove that entanglement is required for any larger n;

You see, the Deutsch-Jozsa algorithm without entanglement is faster only in the case when the number of qubits is 2, and only after one query! Doesn't that look rather unimportant? —Preceding unsigned comment added by Weekwhom (talkcontribs) 05:47, 6 June 2008 (UTC)
From the same paper it seems that the Deutsch algorithm without entanglement uses Hadamard gates, and combining this with the exponentially small advantage and with the fact that a probabilistic (or classical) computer doesn't use the additional 3 Hadamard gates, one could conclude that the Deutsch algorithm without entanglement isn't faster than a probabilistic or classical computer, because it has 3 gates more than the probabilistic computer. This exponentially small advantage can be cancelled out by the fact that the classical computer doesn't use Hadamard gates! —Preceding unsigned comment added by Weekwhom (talkcontribs) 05:56, 6 June 2008 (UTC)

I think you are right that in the mathematical model a quantum computer with entanglement running the Deutsch-Jozsa algorithm is faster than a probabilistic computer, because even if the Deutsch-Jozsa algorithm uses 4 times more gates than the probabilistic computer, or even 100 times more, it still doesn't matter: the probabilistic computer will fail with some small probability, while the quantum computer's failure probability is 0. But that failure probability is so small that the end of the universe will come first and everything will explode, including the quantum computer, and thus there will be no advantage for the quantum computer... Weekwhom

Notice that it's query complexity we compare in the Deutsch-Jozsa algorithm, fixing probability and time. From the paper you cite: Note also that here the quantum-to-classical gap in the exact case diminishes from O(2^n):O(1) to O(n):O(1). This is the price we pay for not using entanglement. Contrasting this with what you have quoted, I'm a little confused. Skippydo (talk) 12:59, 6 June 2008 (UTC)

I actually don't understand this quote very well: "...diminishes from O(2^n):O(1) to O(n):O(1). This is the price we pay for not using entanglement." According to my understanding, a quantum computer (with entanglement) is only sometimes exponentially faster and sometimes polynomially, but they are probably talking about an exponential speedup for the quantum computer with entanglement and a polynomial one for the Deutsch-Jozsa algorithm without entanglement. But I am also confused that in another place it says that entanglement is required for more than 2 qubits in the Deutsch-Jozsa algorithm without entanglement. There is some mismatch between those papers: one says that the advantage is exponentially small and only after one query, the other doesn't say anything about queries... And I can't find more papers than those two; maybe those papers are rubbish? Sometimes there are wrong papers on arxiv.org. —Preceding unsigned comment added by Weekwhom (talkcontribs) 17:45, 6 June 2008 (UTC)

Criticisms?

In most Wikipedia articles about potential/bleeding edge technologies, there are criticism sections about the revolutionary claims. Does this page need to list criticisms of the idea of quantum computing? Are there any valid criticisms of quantum computing or quantum computing theory or w/e? Nave.notnilc (talk) 01:44, 19 January 2009 (UTC)

Well, I have some criticisms about how quantum computing is typically described... I don't want to risk ranting about it too much but I will say one thing. Qubits are typically said to be in two states simultaneously - fair enough, but we must be careful to understand that this is just a way of describing the weirdness of quantum mechanics. The qubits aren't really in two states; it's just that the state they are in might not be an eigenstate of the computational basis, and so we write it as a superposition. It is still just one state. Here's a quote from the article: "In general a quantum computer with n qubits can be in an arbitrary superposition of up to 2^n different states simultaneously (this compares to a normal computer that can only be in one of these 2^n states at any one time)." This is a common way of describing what is going on but it can be a bit misleading. The system of n qubits is still only in 1 state at a time, but to properly describe that state we often write it as a 'superposition' of states in the computational basis (the '0 and 1' basis). The same state could just as well be written in some other basis, in which it will not be a superposition. When we do our quantum calculation with one of these superposition states as an input, we aren't really computing on a bunch of states at once, we are just computing on a single state which happens to be written as a superposition.

It's a kind of subtle point and I don't trust that I'm able to articulate it properly, so I'm not going to change the article. But I'd encourage someone else to do so. --Karadoc** (talk) 03:13, 13 August 2009 (UTC)
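To put a number on the point above (a throwaway sketch, not proposed article text): the same single-qubit state vector looks like a superposition when written in the computational basis, but is a single basis vector when written in the Hadamard basis.

    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    psi = (ket0 + ket1) / np.sqrt(2)     # one physical state

    # coordinates in the computational {|0>, |1>} basis: a "superposition"
    print(psi)                           # [0.70710678 0.70710678]

    # coordinates of the same state in the Hadamard {|+>, |->} basis: a single basis vector
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    print(H @ psi)                       # [1. 0.]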

SCIAM - Quantum Leap: Information Teleported between Ions at a Distance

January 22, 2009

SCIAM - Quantum Leap: Information Teleported between Ions at a Distance

Link between spatially separated ions could form the basis of quantum communications

By John Matson

[snip]

A group of researchers, however, report today in Science that they've made headway in quantum teleportation, and thus communication. The team, led by physics graduate student Steven Olmschenk at the University of Maryland, College Park, succeeded in teleporting quantum information between ytterbium ions (charged atoms) three feet (one meter) apart.

[/snip]

http://www.sciam.com/article.cfm?id=quantum-teleportation-with-ions

jwalling (talk) 23:40, 22 January 2009 (UTC)

Quantic Noise And Quantum Computers

Couldn't quantum computers be made using classical electrical circuits whose inputs have very high quantum noise? High quantum noise would mean the input would be in a great number of possible states at the same time. This would then propagate to the rest of the circuit, which could act as a filter that selects only the states that satisfy a given condition (SAT). This could be useful not only in cryptography but also in protein folding and docking, which is basically a search for the lowest energy state.

fork

Could someone please decide what is the appropriate treatment for quantum parallelism. — RHaworth (Talk | contribs) 05:54, 9 February 2009 (UTC)

Are you asking, should quantum parallelism link to quantum computer? Skippydo (talk) 18:06, 9 February 2009 (UTC)

I am asking someone familiar with quantum physics to look at the previous states (note that the long list of refs is copy&pasted from quantum mechanics). Note also the comment "this page was created … as a joke". Then decide what should exist on that title. Do not let my redirect influence your judgement. — RHaworth (Talk | contribs) 06:47, 10 February 2009 (UTC)

The redirect QC should remain. The previous content was nonsense. Quantum parallelism likely does not deserve its own article. Skippydo (talk) 19:06, 10 February 2009 (UTC)

Incomplete phrases in "Potential" section

  • Quantum register: A scalable physical system with well characterized [???] representing qubits that in turn compose the quantum register.
  • Initialization: The ability to prepare the state of the register in a initial state.
  • Universal set of gates: The ability to implement a universal set of logic gates.
  • Low error and decoherence rate: High fidelity of gate operation, with probability per gate [???] < 10−3 and qubit decoherence times that are longer than the gate operation time.
  • Read-out: The ability to reliably measure the state of individual qubits [???] the computational basis.

The [???] represent what appear to be missing words:

  • Well characterized whats representing qubits?
  • Probability per gate of what < 10-3?
  • For the third one, I'm not entirely sure what's missing, but it's not a sensible phrase as it stands.

Because I don't know what's supposed to be here, and there's no citation I can follow to see the source that's being summarized, I can't fix it. Hopefully the original author or someone else can. --76.216.10.112 (talk) 09:06, 1 March 2009 (UTC)

I've simply removed the offending content. Skippydo (talk) 19:21, 1 March 2009 (UTC)

Genome evolution analogous to quantum computer

Can someone more familiar with the subject please review the section Genome evolution analogous to quantum computer. I have tagged it as Original research since no citations are given and the embedded links do not show any of this behaviour described in the section. HumphreyW (talk) 03:44, 9 April 2009 (UTC)

This concept is based on solid research finding published in refereed journals. Gene conversion does act to cohere different gene alleles and interspersed repetitive DNA does de-cohere. The analogy with Quantum Computing is just that.....an interesting similarity. The inclusion of this suggestion is meant to stimulate cross-disciplinary thinking. Crunkcar (talk) 13:47, 9 April 2009 (UTC)

Thanks for explaining some more. Is the use of the terms 'cohere' and 'de-cohere' the same meaning as used when discussing QC coherence/decoherence? I can't see how this relates to QCs for the purpose of computing. HumphreyW (talk) 14:15, 9 April 2009 (UTC)

Gene conversion occurs primarily during meiosis. As you know, all chromosomes are present in pairs and the genes residing on those chromosomes are also present in pairs. Gene conversion compares these sister DNA sequences. When variations are found between DNA sequences, they are homogenized, meaning that one of the variant sequences (called an allele) is corrected to be identical to the other. Gene conversion and sexual reproduction act as a coherence mechanism forcing all the alleles within a species to conform to the same sequence. This is a good thing for maintaining the integrity of a genome within a species, but the point of evolution is to evolve new species, something that will not happen if every variation arising is removed by gene conversion. To get around this, genomes employ de-coherence mechanisms. These are repetitive DNA transposons that can randomly insert around the genome. Their insertion creates non-homologies between sister DNA sequences that suppress gene conversion events in the neighborhood where they insert. In other words, gene conversion mechanism require extended DNA homologies, which are interrupted by transposons. By suppressing gene conversion, alleles de-cohere and evolution of new genes is allowed. Again..... the DNA side of this analogy is documented by peer-reviewed literature. As far as its application to quantum computing.... my only wish is to stimulate new connections for the readers. Crunkcar (talk) 14:51, 9 April 2009 (UTC)

Okay, but none of that says anything about it being related to 'Quantum' Coherence. Explaining about genes and alleles is all very well but you still have not shown how any of it relates to QC. The word cohere has more meanings than one and I think you are confusing the word 'cohere' above to mean that it is somehow involved with Quantum coherence. Please explain how all the above is related to QC. Thanks. HumphreyW (talk) 23:06, 9 April 2009 (UTC)
This section does not belong in this article. "Decoherence" is being used in a completely different sense. This is not quantum information processing (although, it is, of course, computation). Perhaps this could be added to DNA computer? Or Computational gene or Biocomputer? ---- CharlesGillingham (talk) 01:15, 10 April 2009 (UTC)
Until the relevance of the section can be properly determined I have placed the section within HTML comment tags. HumphreyW (talk) 07:07, 10 April 2009 (UTC)

At first glance, knowing nothing about DNA, I would guess that genes do not have phase. I could be wrong. Without an acceptable citation, this section should not be included anywhere. Skippydo (talk) 15:49, 10 April 2009 (UTC)

Article May Be Too Advanced for the Lay Reader, Consider using examples, and metaphor

Wikipedia is not a science journal. It is a public-use online encyclopedia, and therefore all scientific terminology or explication should include explanations that are clearly understandable to the average non-scientist reader. Sean7phil (talk) 19:05, 19 May 2009 (UTC)

I agree, this is about the most confusing article I have ever read. I hope no-one gets too upset that I added a few citation-needed tags to this. DarkShroom (talk) 23:45, 28 February 2010 (UTC)

Extremely thorough in research, but the bulk of the article is written in language which may be too sophisticated for the lay reader, particularly sections 2 and 4.1. What metaphors, examples or summaries could be used to make this article clearer for the lay reader? --Jonathon212 (talk) 20:11, 24 June 2010 (UTC)

First electronic quantum processor!

Can someone (with more knowledge in these matters) include information about this breaking news? Here are various links to the same article:

This morning I was reflecting on the possibilities of a computer that can tell us that '2 + 2 = 4' with greater than 50% probability. ;-) —RJH (talk) 21:57, 16 November 2009 (UTC)

A non-technical introduction to QC

It seems that many editors feel there should be a another article giving an accessible, non-technical introduction to the subject, much like Introduction to quantum mechanics is such an article for quantum mechanics, and Introduction to special relativity is for special relativity.

Personally, I'm not sure how well this can be done. How non-technical should it be? For that matter, I feel that this article itself is quite non-technical. If we do have another article for a non-technical introduction, then maybe this article can have more math, making this a good introduction to someone who understands basic concepts of linear algebra, logic gates and probability theory. --Robin (talk) 18:24, 10 July 2009 (UTC)

I can't really make a suggestion on how to do it because this article already makes my head explode (as does the one about quantum superposition), but I'd like to second the notion that it should be done. Quantum computing seems like it will be an important concept in the coming years, so it would be nice if non-physicists could get their hands on an explanation. -- Zacqary Adam Green (talk) 08:25, 1 August 2009 (UTC)
For one thing, the article sorely lacks just a basic 'Applications' section to at least point what the stuff can be used for. --87.53.12.76 (talk) 15:32, 7 September 2009 (UTC)
I found it difficult to tell from a glance what QC exactly is and what it does. I suppose devoting an hour or so would solve my problem, but I was looking up the term to understand another text, which was time-consuming enough. I appreciate the detail Wikipedia editors and writers go into, but there should be some "long story short" section. A paragraph at the top of the page explaining how a QC is different from a normal computer (in function, constraints, limits, efficiency and results, not operation mechanics, although naturally the former is influenced by the latter) would be wonderful.
I support the idea of applications as well, and especially some examples along the lines of "Simple problem X will take the same amount on QC or Trad. C to solve. Hard problem Y will be slightly faster on a QC because of QC Property A, but also more difficult to implement because of QC Implementation Hurdle Alpha. Interesting problem Z is substantially faster in QC due to property B, despite implementation problem beta. Weird problem W is actually easier to implement on QC because of implementation principle theta."
Of course, the actual relations above are all made up at random, since I have no idea how QC compares to other computing and how it is different. As another example, if the article were about thermodynamics, I would expect it to say somewhere "In conclusion, matter and/or energy cannot be created from scratch, and any energy conversion has an efficiency of less than 1 (limited, in fact, to that of a Carnot engine)." --Ar-Pharazôn (talk) 17:28, 28 November 2009 (UTC)

Such a technical article may not be appropriate, though extremely well researched. Jonathon212 (talk) 20:15, 24 June 2010 (UTC)

I saw Quantum Computing mentioned in the newspaper and came here to learn about it. Unfortunately I learned very little - the article is way beyond this layman (with a masters degree in Mathematics). --Brian Fenton (talk) 21:25, 15 October 2010 (UTC)
I found this article http://cryptome.org/qc-grover.htm very helpful in explaining the concept. --Brian Fenton (talk) 21:00, 16 October 2010 (UTC)

Problems and Limitations

Shouldn't the article have a generic "problems and limitations" section? Gravity-induced decoherence, for example, can be a fundamental limitation. The fundamental "quantum limit" on computing speed is another one. --Dc987 (talk) 22:02, 14 December 2009 (UTC)

Tertiary quantum computer

Perhaps the Ternary quantum computer can be mentioned ? See http://nextbigfuture.com/2009/08/qudits-multilevel-versions-of-qubits.html —Preceding unsigned comment added by 81.243.191.103 (talk) 12:38, 6 January 2010 (UTC)

Automate archiving?

Does anyone object to me setting up automatic archiving for this page using MiszaBot? Unless otherwise agreed, I would set it to archive threads that have been inactive for 30 days and keep at least ten threads.--Oneiros (talk) 21:06, 23 January 2010 (UTC)

Sounds good. --Robin (talk) 22:18, 28 January 2010 (UTC)
 Done--Oneiros (talk) 11:34, 18 February 2010 (UTC)

Production-ready quantum computers

Are there any production-ready quantum computers, and what are they being used for? Amazon claims that http://aws.typepad.com/aws/2010/04/introducing-qc2-the-quantum-compute-cloud.html is actually a quantum computer. Anyone with info? Kendirangu (talk) 06:12, 12 April 2010 (UTC)

This was an April Fool's joke, as is mentioned at the bottom of the article. 137.149.156.4 (talk) 13:22, 14 July 2010 (UTC)

what ***** does this mean

can anybody in earth-human terms explain this matter...

yeah I agree. this article has plenty of good information in it but is poorly organized and explained. I understand that more advanced articles such as the one on topological quantum computation cannot accurately be brought to the level of the layperson, but this article needs a lot of work. — Preceding unsigned comment added by 208.92.16.72 (talk) 16:32, 9 June 2011 (UTC)

Centre for Quantum Photonics?

It was mentioned in [this article] by Hot Hardware. I believe this could be expanded in respect to this Quantum computer article. Komitsuki (talk) 14:03, 22 September 2010 (UTC)

Article Failure

The reason someone looks up "quantum computer" in an encyclopedia is not because they are already quantum physicists. I couldn't even find a reasonably phrased statement on how much faster quantum computers would be than deterministic computers of comparable size. This article reads like it was written either by people who want to show off, or people who are so oblivious that they are completely incapable of translating technical language into plain language. The level of technical language in this article is inappropriate for an encyclopedia. Could someone who has: (a) knowledge of quantum computers, AND (b) an ability to speak and interact socially with normal human beings, and (c) no desire to prove to the world how intelligent they are, please rewrite this article successfully? Thanks. --Daniel (talk) 22:45, 10 August 2011 (UTC)

I just wanted to know what "2^(n/2) invocations of the underlying cryptographic algorithm" means. I think there should be a link or explanation, not just a casual use of this phrase and then on to the next thing. skank-L juice 00:55, 28 September 2010 (UTC) aka 96.245.96.95 (talk) 00:52, 28 September 2010 (UTC)

What is confusing, the use of 2^(n/2), the phrase "invocations of the underlying cryptographic algorithm", or the whole thing? —UncleDouggie (talk) 05:34, 28 September 2010 (UTC)
omg... look, if a supposed date doesn't get it, it's too complex for an introductory article. If in doubt, go on a date and say "2^(n/2) invocations of the underlying cryptographic algorithm" and check her facial expression. You can also try it with granny. —Preceding unsigned comment added by 85.226.3.146 (talk) 20:13, 11 November 2010 (UTC)
Yo mom, you know about "2^(n/2) invocations of the underlying cryptographic algorithm"? You didn't? o.0 —Preceding unsigned comment added by 85.226.3.146 (talk) 20:39, 11 November 2010 (UTC)
well, I'll only be satisfied if I try this line on Jimbo Wales. then I'll know if it's good pedia content or.. well, I'll know what kind of lay-person intelligibility I'm looking at. 96.245.96.95 (talk) 14:47, 4 July 2011 (UTC)
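For whatever it's worth, that phrase presumably refers to Grover's search applied to key recovery: against an n-bit key, a classical brute-force attack needs on the order of 2^n trial encryptions, while Grover's algorithm needs on the order of 2^(n/2) invocations of the cipher. A back-of-the-envelope sketch (the key sizes are just examples):

    # Orders of magnitude only; constants and error-correction overhead are ignored.
    for n in (56, 128, 256):
        print(f"{n:3d}-bit key: classical ~ 2^{n} trials, Grover ~ 2^{n // 2} cipher invocations")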

I cannot see any compelling reason to have the second paragraph in the section "Potential". It bloats the article and is off-topic in this section. It also reads like an advertisement for post-quantum cryptography and should be content of the separate article about PQC. The first paragraph already states that quantum computers allow one to "decrypt many of the cryptographic systems in use today",... —Preceding unsigned comment added by 130.83.224.120 (talk) 11:56, 18 October 2010 (UTC)

universal gate set

This article uses the phrase "universal gate set". What does that mean? And how is this related to the "universal quantum computation" mentioned in the Toffoli gate article? Does that phrase mean something like "any set of gates that, given sufficient numbers of them, can be cascaded to form a Turing-complete machine."? Or is there something more to it than that?

Is there some other article that describes "computational universality" and "universal gate set" (and the distinction, if any, between them) in more detail? And perhaps lists a few examples? --68.0.124.33 (talk) 04:54, 8 February 2011 (UTC)

If I recall correctly, it denotes a set such that an arbitrary gate can be approximated by a sequence of gates from that set. Since there is no memory/state, I don't think it is directly related to Turing-completeness. Skippydo (talk) 03:43, 10 February 2011 (UTC)
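As a rough illustration of that description (a brute-force sketch only, assuming the commonly cited single-qubit set {H, T}; real compilers use much smarter methods such as Solovay-Kitaev): searching over short products of H and T gates tends to get closer to an arbitrary target rotation as longer products are allowed.

    import numpy as np
    from functools import reduce
    from itertools import product

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])   # "pi/8" gate

    def distance(U, V):
        # operator distance ignoring global phase (0 means equal up to phase)
        return 1 - abs(np.trace(U.conj().T @ V)) / 2

    theta = 0.3
    target = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])  # an arbitrary rotation to approximate

    best_so_far = 1.0
    for length in range(1, 13):
        for word in product((H, T), repeat=length):
            U = reduce(np.matmul, word)                   # the gate sequence as one unitary
            best_so_far = min(best_so_far, distance(U, target))
        print(length, round(best_so_far, 4))              # best error found so far never increases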

Double Negative

Opening contains: "However, there is up until now no mathematical proof that classical algorithms that are as good as quantum algorithms cannot be found (see Quantum complexity theory)." The double negative "no mathematical proof...cannot be found" is confusing, I'd fix it but I don't understand what's trying to be said well enough to clarify. Suggestion: "However, there may be classical algorithms that are as effective as quantum algorithms that have not yet been found (see Quantum complexity theory)." But I'm not sure if this is accurate, could someone verify that it is or suggest something better? Branwen4724 (talk) 17:02, 26 March 2011 (UTC)

It doesn't appear confusing to me. So far, no one has been able to mathematically prove that fundamentally quantum algorithms are better than classical algorithms. You have to keep the "no mathematical proof" part else you lose the reasoning. HumphreyW (talk) 23:44, 26 March 2011 (UTC)
It's certainly a clumsy sentence. I've reworded it a bit. Skippydo (talk) 01:18, 27 March 2011 (UTC)
I undid what you changed. You said "Simon's algorithm, which run exponentially faster than any possible probabilistic classical algorithm." This is not proven. I can find no reference showing that all possible classical algorithms are proven exponentially slower, and I doubt I ever will find such a reference, because no one can ever know all possible future algorithms.
But we should also keep the statement about "no mathematical proof", it is important. HumphreyW (talk) 01:56, 27 March 2011 (UTC)
Concerning Simon's algorithm: it is (provably) faster than any classical one given the "quantum oracle" for the problem. To my knowledge, that oracle is not known to be efficiently implementable on a quantum computer.
Concerning the confusing sentence: would it be clearer to write?
However, it has not been proven so far that classical algorithms which are as fast as quantum algorithms do not exist. In particular, it is still an open question whether quantum mechanics can be efficiently simulated on a classical computer. Inefficient simulation is possible, and therefore all problems solvable with a quantum computer can also be solved using a classical computer.[3]
--Qcomp (talk) 09:49, 27 March 2011 (UTC)
For me that proposal is less clear than the original. And it leaves out the "no mathematical proof" part. HumphreyW (talk) 10:14, 27 March 2011 (UTC)
ok. I think "proven" could be replaced by "mathematically proven" (although I think they are synonymous ;-); but if it's not clearer then let's keep on looking for something better;
the whole business of "faster" is anyway more complicated, since there are many different notions of speed (even in complexity theory): QC is known to be exponentially faster relative to certain oracles (Simon) and with respect to communication complexity (Raz), and to be polynomially faster in the sense of query complexity (Grover). But of course the real potential is a super-polynomial computational speedup without oracles, and for this there is no proof. I guess it might make sense to expand the section Quantum computer#Potential in this direction. --Qcomp (talk) 10:42, 27 March 2011 (UTC)
I've restored my edit with a reference. The contentious statement appears in the abstract of the cited paper.
On a side note, we prove non-existence results in mathematics all the time. Consider, for instance, the halting problem. As a more tenable example, consider comparison-based searching of a sorted list of n items. The index contains log n bits of entropy, and the comparison operator gives only a single bit. Hence, a comparison-based search requires log n queries to the list. I hope this explanation helps! Skippydo (talk) 01:43, 28 March 2011 (UTC)
I am unable to verify your reference since the site requires authorisation (401 error). Can you find another site that does not place such restrictions? HumphreyW (talk) 01:59, 28 March 2011 (UTC)
Try searching on scholar.google.com. I don't want to link directly to the pdf, just in case the link is location or session specific. Cheers! Skippydo (talk) 02:54, 28 March 2011 (UTC)
But we can't put cites into WP pages that require authorisation. Also we can't tell the reader to go and search on an external website. Both of these things fail to meet the reliable source requirements. HumphreyW (talk) 02:58, 28 March 2011 (UTC)
An alternative reference is chapter 6 of John Preskill's lecture notes: John Preskill, "Quantum Computation", p. 43.
In my opinion, the formulation "which run exponentially faster than any possible probabilistic classical algorithm" is misleading, since it leaves out the fact that the exponential separation is only relative to an oracle, and it is not known whether the oracle needed for Simon's problem can be implemented efficiently --Qcomp (talk) 08:55, 28 March 2011 (UTC)
The reference is in the journal, not the website. Skippydo (talk) 14:40, 28 March 2011 (UTC)
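A toy check of the counting argument a few comments up (a sketch, nothing more; each three-way comparison is counted as one query to the list):

    import math

    def comparisons_used(sorted_list, target):
        lo, hi, count = 0, len(sorted_list) - 1, 0
        while lo <= hi:
            mid = (lo + hi) // 2
            count += 1                      # one (three-way) comparison against the list
            if sorted_list[mid] == target:
                return count
            elif sorted_list[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return count

    n = 1024
    data = list(range(n))
    worst_case = max(comparisons_used(data, t) for t in data)
    print(worst_case, math.ceil(math.log2(n)))   # worst case sits right around the log2(n) bound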

Can't Solve the Halting Problem!

The article was recently changed to say "Ignoring computational and space constraints, it is not known if a quantum computer is capable of solving problems which classical computers cannot." Of course, this is wrong as a quantum computer can be simulated on a classical machine given enough time and space. Recommend we revert back to what it previously said: " If one disregards the question of efficiency (i.e., given enough time and resources) all problems solvable with a quantum computer can also be solved using a traditional computer." —Preceding unsigned comment added by 75.15.129.234 (talk) 20:46, 1 April 2011 (UTC)

You are correct. I accidentally negated one of the statements in an earlier copyedit. I looked up the reference just to be sure, and indeed, what you say is true. Skippydo (talk) 19:31, 2 April 2011 (UTC)

Bad Citation

Citation number 4 is unclear. I know nothing about this and can't follow the citations. Someone who knows something help?Darrell Wheeler (talk) 14:23, 9 April 2011 (UTC)

The book is very well-known in the field. I've given a more helpful citation. Perhaps you'll find this helpful. Skippydo (talk) 19:37, 11 April 2011 (UTC)

"Computer" versus "Computing"

It would seem that an article on quantum /computers/ would give some details on the sort of hardware that is either available, or which theoretically needs to be available, in order to construct a functional "quantum computer." Yet I don't seem to be finding any information like this in the article. Can quantum computers be built on electronic circuitboards using doped silicon chips? Do they require linear particle accelerators or massive installations like the Large Hadron Collider? Can they be made with Tinker Toys(tm) and Legos(tm). Or what?

In fact, this article is about theories of quantum /computing/, and not about quantum /computers/, per se. I suggest changing the title accordingly, or else furnishing the requisite hardware information. —Preceding unsigned comment added by 74.92.174.105 (talk) 00:20, 4 May 2011 (UTC)

  • I agree. This article is full of gibberish to someone with a CSEE background, let alone a lay reader. It is in dire need of a better introduction and more real-world examples. This is an encyclopedia, not a PhD fellowship research reference. 67.188.136.186 (talk) 23:16, 19 January 2012 (UTC)

Horrendous & Impossible to read

The basic idea behind this (?) is that regular bits are either OFF (A) or ON (B). And according to my layperson understanding, this qubit has three states (A), (B), and (AB) sort of like blood types. But this article is impossible for a layperson to get basic information like that readily. And I think somewhere is a connection to basic particles of protons (+1), neutrons (0), and electrons (-1).... because those also have 3 states. — Preceding unsigned comment added by 67.244.121.144 (talk) 15:22, 17 June 2011 (UTC)

Retrocausality in quantum computing

I know somewhere I read about retrocausal quantum computer theories; where computations would be completed before they were initiated. Why isn't there any linkage to literature on that? 74.209.54.156 (talk) 10:04, 21 July 2011 (UTC)

"One of the more bizarre properties of QCs was identified by a team at the University of Illinois at Urbana-Champaign. They presented the first demonstration of "counterfactual computation", inferring information about an answer, even though the computer itself did not run" [15] 66.243.213.232 (talk) 01:02, 17 September 2011 (UTC)

this is copied from a book

Comparing the text beginning at the first chapter of Quantum Computing by Ximena Byrnes, a 2011 online book with ISBN 978-93-81157-91-6, the text on Wikipedia is an almost exact copy, but there is no mention of the book in the sources.

link to book, probably wont work:

http://reader.eblib.com.ezproxy.library.wisc.edu/%28S%28ir3chenla3d1fdhoirwzpvxv%29%29/Reader.aspx?p=735741&o=691&u=UW555H383&t=1317245875&h=703F320C432AF3360021BC30CA61E25A417B03D7&s=10849733&ut=2121&pg=1&r=img&c=-1&pat=n# — Preceding unsigned comment added by 128.104.0.87 (talk) 21:47, 28 September 2011 (UTC)

The introduction to this article predates the publishing of the book by a wide margin. The appropriate question is does the book cite wikipedia. Feel free to paste the offending portion of the book here and we can have a laugh about it. Skippydo (talk) 22:12, 28 September 2011 (UTC)


Yeah, the book should cite wikipedia, not the other way around. But I don't see any citations.

Here are the first five pages of the ebook. Except for the images taken from Wikipedia, the book is a plain text document with bolded and enlarged text headings. It looks like something made quickly in Microsoft Word. And I don't see any citations anywhere.

<<page 1 of 101>>

[the cover of the book, blue with a double helix on top of a keyboard.]


<<page 2 of 101>>

First Edition, 2011

ISBN 978-93-81157-91-6

© All rights reserved. Published by: The English Press 4735/22 Prakashdeep Bldg, Ansari Road, Darya Ganj, Delhi - 110002 Email: info@wtbooks.com

<<this part is sideways on the left margin of every page, so i'll delete it after this>> © Byrnes, Ximena, May 01, 2011, Quantum Computing The English Press, New Delhi, ISBN: 9789381157916



<<page 3 of 101>>

Table of Contents Chapter 1- Introduction to Quantum Computing Chapter 2 - Quantum Superposition Chapter 3 - Quantum Entanglement Chapter 4 - Superconductivity Chapter 5 - Quantum Dot Chapter 6 - Nitrogen-Vacancy Center Chapter 7 - Bose–Einstein Condensate Chapter 8 - Important Concepts in Development of Quantum Computing Chapter 9 - Qubit Chapter 10 - Timeline of Quantum Computing


<<page 4 of 101>>

Chapter- 1

Introduction to Quantum Computing

<<image of Bloch sphere from wikipedia>> The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.

A Quantum computing is also called quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from traditional computers based on transistors. The basic principle behind quantum computation is that quantum properties can be used to represent data and perform operations on these data. A theoretical model is the quantum Turing machine, also known as the universal quantum computer.

Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits (quantum bit). Both practical and theoretical research continues, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis.

If large-scale quantum computers can be built, they will be able to solve certain problems much faster than any current classical computers (for example Shor's algorithm). Quantum computers do not allow the computation of functions that are not theoretically computable by classical computers, i.e. they do not alter the Church– Turing thesis. The gain is only in efficiency.

Basis (bold heading at bottom of page)


<<page 5 of 101>>

A classical computer has a memory made up of bits, where each bit represents either a one or a zero. A quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or, crucially, any quantum superposition of these; moreover, a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8. In general a quantum computer with n qubits can be in an arbitrary superposition of up to 2n different states simultaneously (this compares to a normal computer that can only be in one of these 2n states at any one time). A quantum computer operates by manipulating those qubits with a fixed sequence of quantum logic gates. The sequence of gates to be applied is called a quantum algorithm. An example of an implementation of qubits for a quantum computer could start with the use of particles with two spin states: "down" and "up" (typically written and , or and ). <<<<<the same symbols as wikipedia uses, but it didn't transfer here>>>> But in fact any system possessing an observable quantity A which is conserved under time evolution and such that A has at least two discrete and sufficiently spaced consecutive eigenvalues, is a suitable candidate for implementing a qubit. This is true because any such system can be mapped onto an effective spin- 1/2 system.

<<same picture that wikipedia has about qubits>>

Bits vs. qubits Qubits are made up of controlled particles and the means of control (e.g. devices that trap particles and switch them from one state to another). Consider first a classical computer that operates on a three-bit register. The state of the computer at any time is a probability distribution over the 2^3 = 8 different three-bit strings 000, 001, 010, 011, 100, 101, 110, 111. If it is a deterministic computer, then it is in exactly one of these states with probability 1. However, if it is a probabilistic computer, then there is a possibility of it being in any one of a number of different states. We can describe this probabilistic state by eight nonnegative numbers a,b,c,d,e,f,g,h (where a = probability computer is in state 000, b = probability computer is in state 001, etc.). There is a restriction that these probabilities sum to 1.


So it's a very similar copy, with the images copied exactly. The first chapter is mainly a copy and paste of this article. Comparing the Wikipedia articles that correspond to the titles of the other chapters looks to give the same result: copying. Not everything was copied exactly, but the majority of what I noticed while quickly looking through it was. Perhaps somebody copied Wikipedia articles into Microsoft Word, put them together, and used an online image for the cover (the cover image is the first result when searching for "dna keyboard" on Google, and it's available in many sizes according to Google. Interesting.) 146.151.54.140 (talk) 00:32, 5 October 2011 (UTC)same person as before

Here's Wikipedia's policy on the issue: [16]. I suspect copyright laws in India aren't so strong and thus pursuing the issue would be a waste of time. Any comments? Skippydo (talk) 19:20, 5 October 2011 (UTC)

Is there any good information on what an "analog" (non-qubit?) quantum computer, or "quantum real computer", would be, exactly? Could such a thing be mentioned in this article if it is meaningful? I assume it may involve quantum waves rather than quantum 'digital' particles? 216.227.117.152 (talk) 08:04, 29 September 2011 (UTC)

A quantum computer is analog (and probabilistic) by definition. Skippydo (talk) 19:59, 29 September 2011 (UTC)
Is not the definition of the use of bits, whether qubits or not, digital by definition? 216.227.117.152 (talk) 00:03, 30 September 2011 (UTC)
Digital means two states. A bit, by definition, has only two states. A qubit has an infinite number of states. Skippydo (talk) 17:19, 30 September 2011 (UTC)
Binary means two objective states, but doesn't digital mean relational duality, literally: "representing data as a series of numerical values" even an infinite series? Qubits ultimately normalize objectively into being only one state in relation to one other state as output? I think of the difference as particle (localized: qubit) or wave (nonlocal: analog) 216.227.117.152 (talk) 23:13, 30 September 2011 (UTC)

Church-Turing thesis violated or not?

In this wikipedia article it is claimed that the quantum turing machine does not violate the Church-Turing thesis. However, in this survey: http://www.cs.berkeley.edu/~vazirani/bv.ps the authors claim to give the first formal evidence that the (modern) Church-Turing thesis is violated by the quantum turing machine. In my opinion, this should be investigated and straightened out. Cbswe (talk) 14:21, 17 December 2011 (UTC)

Quantum computers don't violate the Church-Turing thesis, though they might violate a stronger version of thesis, which is sometimes called the extended or physical CTT. --Robin (talk) 05:45, 21 December 2011 (UTC)

(a,b,c,d,e,f,g,h) - Probabilities and vector coordinates

In the 'Bits vs. qubits' section the letter set is first used as a set of probabilities and in the next sentence as a set of vector coefficients.

"We can describe this probabilistic state by eight nonnegative numbers a,b,c,d,e,f,g,h (where a = probability computer is in state 000, b = probability computer is in state 001, etc.). There is a restriction that these probabilities sum to 1. The state of a three-qubit quantum computer is similarly described by an eight-dimensional vector (a,b,c,d,e,f,g,h), called a ket. However, instead of adding to one, the sum of the squares of the coefficient magnitudes, , must equal one."

As pointed out in the text, in the latter case the probabilities are equal to the squares of the coefficients a,b,c..., which is confusing. A different letter set should be chosen for the probabilities and the vector coefficients, e.g. capital letters A, B, C, ..., where A = |a|^2, etc., for the probabilities. — Preceding unsigned comment added by Zinger0 (talkcontribs) 20:07, 27 December 2011 (UTC)
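To make the distinction concrete (a sketch with made-up example amplitudes, not proposed article text): the eight numbers describing a three-qubit state are amplitudes, and the measurement probabilities are their squared magnitudes, which is exactly why reusing one letter set for both reads oddly.

    import numpy as np

    # eight complex amplitudes (a, b, ..., h) for the basis states 000, 001, ..., 111
    amplitudes = np.array([3, 1, 0, 2, 0, 1, 1, 0], dtype=complex)
    amplitudes /= np.linalg.norm(amplitudes)        # enforce sum of |amplitude|^2 == 1

    probabilities = np.abs(amplitudes) ** 2         # Born rule: P(x) = |amplitude_x|^2
    for x, p in enumerate(probabilities):
        print(f"{x:03b}: {p:.3f}")
    print("total:", probabilities.sum())            # 1.0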

In the English Quantum computer article it says close to the end: 'It has been speculated that theories of quantum gravity, such as M-theory or loop quantum gravity, may allow even faster computers to be built.' We want to know where one can find more information regarding this topic. Perhaps the author can share what he/she thinks of computers that may differ from the quantum computer building principle we see today? Vachapaha (talk) 09:10, 19 May 2012 (UTC)

Quantum Isolation Quantum Computer References

Why are there not citations to the prior arts of Quantum Isolation based Classical Scale Quantum Computers? It seems there should be some references to this approach, given the description of Schrodinger's Cat Systems, where a radioactive source with an indefinite quantum decay time, is used to make one probabilistic branch on the outcome of a cat placed in that quantum system, that is projected into one state again, when a larger system Quantum Grounds the system. For Quantum Isolation based Classical Scale Quantum Computers, a Classical Scale Computer is placed into Quantum Isolation, like a Spaced Based Orbiting Computer Satellite, and Radioactive Source decays all of indefinite quantum time are used to produce computer branching data, that places a classical scale computer system into an Operational Superposition of States, taking every potential branch from the greater universe system perspective, and after executing its code, the satellite broadcasts its branch record, with a delay proportional to the criteria of result utility, causing Quantum Grounding of the Computer's Superposition. Conversely, without such references, makes the implication that the concept of Schrodinger's Cat Systems is often over applied, with a lack of empirical pragmatic results in any real world application, like Quantum Isolation based Classical Scale Computers, where radioactive sources can place the Classical Scale Computer into two or infinite number of Superposition States, reducible to the optimal solution, like the ubiquitous Schrodinger's Cat Concept. Ref. SET236765732714909171744543750. 76.167.47.5 (talk) 17:01, 22 June 2012 (UTC)

Needs discussion of computability in addition to merely computational complexity

Here I'll use the term "more powerful" in terms not of efficiency but in terms of being able to solve at all a larger class of problems than the "less powerful". The article rightly points out that a quantum computer is no more powerful than a universal Turing machine. However, this section needs expansion to address the issue of whether a physically realizable quantum computer can be more powerful than a physically realizable non-quantum computer. This is not covered by the TM note, since a non-quantum computer is less powerful than a TM, given that it has a finite tape; i.e. it is not more powerful than a deterministic linearly bounded automaton. Now, it would be great if someone with knowledge of this area can comment on whether a quantum computer is more powerful than a non-quantum one in this sense; equivalently, does quantum computation bring with it the ability to recognize formal grammars that are not recognizable by a deterministic LBA? A possible second point to address is whether my question is related to the first LBA problem (which it would be if a quantum computer is equivalent to a non-deterministic LBA).

A quick note about justifying the assumption that a physically realizable computer is less powerful than a TM: this follows from the following: 1) the Bekenstein bound, which derives from thermodynamics and quantum theory and puts a hard limit on the maximum number of distinguishable quantum states in a finite spatial extent containing finite mass-energy (see the eponymous wiki article), 2) relativity, which limits the size of an effectively integrated physical object (in the sense that its sub-components can communicate with each other in finite time) to its light cone (thus limiting the spatial term in the Bekenstein bound), 3) said light cone being limited in spacelike extent as it is limited in timelike extent by quantum uncertainty, which guarantees that as time goes to infinity, the probability that any mechanism for timing/regulating the computation will fail goes to certainty, and 4) accelerating expansion of the universe, which limits the amount of mass-energy that can be ultimately contained in a computer to no more than the computer's original Hubble volume (thus limiting the mass-energy term in the Bekenstein bound). So we not only have a limit on information density (no arbitrary precision real numbers) preventing us from implementing super-Turing computation, but we also have a limit on space and time, so even countable infinity is not accessible for computation, so we cannot implement an infinite-tape TM. ThVa (talk) 22:34, 23 June 2012 (UTC)

Since a quantum computer can be simulated probabilistically with exponential overhead, I'm not sure anything interesting can be said about computability. Although I do recall some claims that quantum computers could solve certain problems that classical computers could not. However, these claims are very old and have since been debunked, if I recall correctly. Skippydo (talk) 23:56, 23 June 2012 (UTC)
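For a sense of what "exponential overhead" means in practice (a rough sketch, counting only raw state-vector storage and nothing else):

    # Memory for a full state vector of n qubits at 16 bytes per complex amplitude.
    for n in (10, 20, 30, 40, 50):
        gigabytes = 16 * 2 ** n / 10 ** 9
        print(f"{n} qubits -> {gigabytes:,.3f} GB")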


Quantum Computer "Carrying Capacity" can be considered in the domain of results that can be achieved in a time window of computation as an [energy/time] analogy which is [power], of [ComputationSpace/time] as [ComputationPower]. A "Schrodinger's Cat" quantum computer that superposition-branches according to a contained radioactive source would cover an exponential result space of branches^branchings. For an Ideal Quantum Computer Traveling Salesman problem, from a start point, a Quantum Computer could branch to every possible remaining next state and branch to every possible remaining next state until the entire space is covered, and then the shortest path is found, which for 100 locations whittles 99*98*97*...*3*2*1 computation path superpositions to one result. A serial computer would have to explore 99*98*97*...*3*2*1 paths serially, i.e. one at a time. For real world applications with a periodic stream of choices, an Ideal Quantum Computer would explore a space of choices^periods, like 10^365 spaces, with more processing paths than particles in the universe. But we see, the limitation of even Ideal Quantum Computers, where there is only one idealised optimised result. For there to be some free will variation in such choice stream systems, one could utilize Parallel Idealised Quantum Computers, where several Ideal Quantum Computers can analyze variations on each space to produce their own optimal answer. But we can see this can be absorbed into a single Ideal Quantum Computer, where the variations of individual choices can be subsumed into additional branches on top of the base stream of periodic choices, to explore a space of (choices*variations)^periods, like (10*5!)^365. In all cases, any Quantum Computer, by a model of Schrodinger's Cat like Superpositions, whether a single device element or Classical Computer, as a "Cat", is computationally enhanced about an envelope of the branching of superposition power, ideally, and what can be computed is found about the number of choices, the degrees of freedom, contained in what is being computed. There may be reduction from Ideal, based on how many superpositions might exist in the device, perhaps an approximate of some [DeviceParticleCount]*[SuperpositionCapacity / ParticleCount] carrying capacity density times the device size. Ref. SET236765732714909171744543750. 76.167.47.5 (talk) 22:47, 27 June 2012 (UTC)
Also missing from the computability discussion, concerning John von Neumann subsystem reiterations, is detail of Hierarchical and equal-level Quantum Circuits, with the descriptions and unit analysis of Kirchhoff's Circuit Laws for Quantum Superposition Flows in discretised element properties, like Quantum Computer Burn-In processes to decay entanglement capacitance, superposition density as aforementioned, and quantum grounding pathway resistance, like "opening the box" and the "Schrodinger's Cat" superposition reduction in quantum grounding, along with any description of quantum computer induction properties if currents of quantum superposition flow form a quantum-superposition-current magnetic-field-like property, and the subproperties created when approximating quantum computer elements, like quantum steroidal properties to be minimised for such quantum computer elements. Altogether, agreed, many elements of description are utterly lacking in proper Quantum Computer Description over 80 years of consideration, and as many years of circuit analysis since about Kirchhoff and before. Ref. SET236765732714909171744543750. 76.167.47.5 (talk) 17:57, 28 June 2012 (UTC)
The issues about Quantum Computer Design appears to lie about the approach, where the Computability topic, given the attempt to build Quantum Computers from discrete quantum devices, along with error crrection elements required each level, as usually described, is questionable, as picking the wrong granularity, closer to science experiment versus quantum computer design proper. If the Schrodinger's Cat "story" means anything to this subject, then Quantum Computers seem to best be approached by putting a Computer built from Classical Scale Current Organization Based Devices, where the Computer ("Cat") itself is Classical Scale, and a radioactive source in the "box" with the Computer, allows branching according to the radioactive signal. From the outside, the Computer "cat" enters into branches of superpositions that are collapsed when the outside observer "looks inside the box". Instead of a dead or alive cat, one has a Computer, for two radioactive multistate branches, with branch.1,1, branch.1,2, ... branch.1,n, branch.2,1, branch.2,2, ... branch.2,n, ... ... branch.m,1, branch.m,2, ... branch.m,n, that are reduced to a single state branch.M,N, or generally, branch,M,N,O,P,Q,R,S ... . It alleviates the many technical problems of recent Quantum Computer design, and simplifies what is computable to what Classical Scale Computers could do, if decision or option branches are fully explored to the optimal solution (e.g. traveling salesman problem, multidimensional optimisation or fitting problems, branching choice problem optimisation). And so while an ideal Quantum Computer of this type achieves algorithm*(choices^layers) computation superpositions, with one optimal solution, with a "dilation" of the algorithm about choices and layers in the baseband classical computer code. AND if such box design approaches are not functional in any way whatsoever, then the classic "story" about Schrodinger's Cat has little relevance to Classical Scale Computers, and only applies to the limited computation carrying capacity of the very few "microscopic" element devices, and has not much future on the scale of a Schrodinger's Cat type "Classical Computer in a superposition box". Ref. SET236765732714909171744543750.

Bits vs. qubits image is misleading

The image File:Quantum_computer.svg provides an incomplete, misleading mental model for a collection of qubits. In particular, it has no way of representing quantum entanglement (e.g. there is no way to represent the state |0> + |4> using this pictorial representation). Garrison (talk) 18:17, 4 September 2012 (UTC)

Wrong use of Ref. 19. No discussion of critical opinions on possibility of QC

I am surprised by the absolutely incorrect and misleading use of Ref. 19: "For a 1000-bit number, this implies a need for about 10^4 qubits without error correction.[19]" First, it is not true, the correct estimate can be found in many books published previously to Ref. 19. Second, there is no such statement in Ref. 19. And third, and most important - as can be seen from the title "Is fault-tolerant quantum computation really possible?" - Ref. 19 contains a critical discussion of the existing principles of quantum error correction and the foundations of the threshold theorem. To my knowledge, the presented arguments were never refuted, but rather simply ignored. It would be only fair to present a critical point of view in this article (instead of advertising D-wave achievements, which are quite doubtful) — Preceding unsigned comment added by Karamzine (talkcontribs) 17:38, 6 September 2012 (UTC)

April 2012 breakthrough

Not sure whether this: http://www.gatech.edu/newsroom/release.html?nid=125111 has already been mentioned in the developments section. 91.182.142.202 (talk) 11:39, 1 October 2012 (UTC)

"Require" applied to binary

The statement "Whereas digital computers require data to be encoded into binary digits" is not true. 82.38.204.141 (talk) 17:31, 9 October 2012 (UTC)

The statement would be better written as: "Whereas non-quantum computers are defined by their data being encoded into binary digits (bits), quantum computation is defined by its data being encoded in bits that can hold many states (not just two). This is usually achieved by utilising quantum properties to represent data and perform operations on these data." 82.38.204.141 (talk) 17:31, 9 October 2012 (UTC)

They do not "require" data to be encoded in binary. It is a design decision that they are this way. In other words, modern computers are encoded in binary by choice and not because of some fundamental limitation on their part. Binary just happens to be the most convenient way of working with and storing data that we currently know of. It is possible to build components on the scale of a transistor (or larger) that store more than two states per bit. Nobody does this though because the research on the advantages of doing so is still in its infancy. 82.38.204.141 (talk) 17:31, 9 October 2012 (UTC)

The future of the computer is quantum computer?

I think so; as the transistor gets smaller and smaller, the computer will become a quantum computer. 180.194.251.97 (talk) 00:10, 3 January 2013 (UTC)

This article is like

bullshit. — Preceding unsigned comment added by 109.60.126.42 (talk) 00:30, 3 January 2014 (UTC)

Quantum computers are not slower in all cases

the phrase "and thus far quantum computers have yet to solve a problem faster than a classical computer." is wrong.

http://www.gizmag.com/d-wave-quantum-computer-supercomputer-ranking/27476/

talking about D-Wave, a 512 bit quantum computer (just over 400 functional bits) " On the largest problem sizes tested, the V5 chip found optimal solutions in less than half a second, while the best classical software solver required 30 minutes to find those same solutions. This makes the D-Wave computer over 3,600 times faster than the classical computer in these tests." — Preceding unsigned comment added by 115.64.210.168 (talk) 12:41, 14 May 2013 (UTC)

Many of the claims of the D-Wave's power are overstated. See http://www.scottaaronson.com/blog/?p=1400 --24.12.205.104 (talk) 00:47, 6 January 2014 (UTC)

Rotistor - The Entangled Rotating at various spins Memristors

The "ROTISTOR" is a term introduced by PhD. professor Veiler Sword at July 14, 2014. It is a non atomic quantum processor. It is a series of entangled memristors, that rotate by switching on and of. Each rotating entangled memristor of the overall rotistor rotates at various rpms. We let the rotistor render data at varius rpm combination controlled by noise and hum numeric generators, for simple tasks we don't use hum/noise ordered spin but algorithms. (hum and noise here are not sounds but numeric diffraction/diffusion modes) Then we average the optimum solution (or solutions) by a classic CPU and an opperating number theory program.


Why is it better than atomic quantum computers? It is cheap, it works at non-extreme temperatures, it is easier to build, you can entangle as many rotating components as you wish, and if you want more accuracy you can entangle rotating memristor components with more available rotational angles; it is cheap to maintain and does not easily break.


We all know that quantum computers are atomic entangled-spin apparatuses, but they require very low temperatures, and that is not practical for mass production.

We also know that quantum computers are data-averaging machines, so they run very fast and average out the optimum option.

We can build mechanical and electrical non-atomic quantum computers that allow entangled spin rotations, and we will still have to run the apparatus for 500 msec to average many million probable solutions and select the optimum.

Today we have the math and the know-how.

Also, we all know about low-temperature hypermagnetic levitation. We can mimic that at normal temperatures with two classic electromagnets, but so that the system does not fail at the sides we have to create a complex magnetic grid, one that at some points even inverts the field polarization so that the system does not fail at one side.


Well, in my lab I have mixed both: quantum non-atomic computing and rotational systems. We can build rotistors, tiny electric parts like entangled memristors in which each component rotates at a different speed; we run the rotistor for 500 msec to average and extract the optimum solution.

Of course the old rotistors are not so fast, so it might take 3 seconds to average the optimum solution, but a well-built one is way faster.

The Koch family can support that effort!

We need practical solutions with tiny rotistors so billions of people can buy a cheap, great quantum computer, not like today, when we have so few atomic quantum computers and they allow only a tiny number of entangled rotating components.

The more a design costs, the sillier the builder, and the more boring the science behind it.

We need PRACTICAL applications available for EVERYONE and HUGE computational power to subserve everyday tasks and not theoretical only.

The "correct" entangled answers allow more current, higher values, wider dynamic ranges.

Bloch CPU

A Bloch CPU or "Bloch entangled transistor sphere [BETS]" is not an actual quantum computer, but it can be used as a "control unit" of an actual computer, it is zillion times slower than an analogue qubit, also the BETS does not allow all answers, thus you have to run it more times to average a result closer to the actual value. It either is an actual transistor sphere [having transistors on it's surface], but more commonly two entangled circles represent all posible connections of the Bloch sphere. Many entangled BETS (consisting of a multiple of two as entangled set-members number) consist the Bloch CPU. It can be used in a non professional way for example as a music, visual effects or video games secondary CPU, when we want a fast intricate and realistic result, but we do not have to run all the suggested times to get the actual average of the wavefunction collapse (non optimum averaging settings NOAS). It works in classic CPU temperatures, it's production is cheap (the classic CPU production techniques are required), it is mass production friendly. A BETS set of 32 entangled spheres when used at NOAS[not fully averaged to save time] mode, can mimic for example realistically many guitar amps and pedals. A classic CPU always is required to control the BETS averaging. For professional or academic use, when we demand a thorough averaging, it can only be used as a secondary control, test or filter of an actual quantum computer, because BETS are slow and not built to replace an actual quantum computer. Noise generators with controllers can provide faster inputs than a program controlled by a classic CPU, thus a noise generator unit NGU is required to avoid CPU overload — Preceding unsigned comment added by 2.84.219.136 (talk) 21:59, 19 May 2015 (UTC)

Probabilistic Computing - Probabilistic CPU - PCPU

A probabilistic CPU is not an actual quantum computer but a "Markov chain" CPU. Theoretically a P-CPU can perform identically to a Q-CPU, but incredibly slower (zillions of times slower). The future of quantum computing will include both a P-CPU and a Q-CPU on the same motherboard. The P-CPU will set the probabilistic tasks, the Q-CPU will perform the main calculation, and a classical CPU will then distribute the data viscerally (towards motherboard components) and peripherally (towards peripherals). Long or complex tasks are better set by a P-CPU acting as the Q-CPU's input and re-evaluative/parametrization (IREP) component. Thus a functional mass-production quantum computer (a Classical-Quantum-Probabilistic, or CQP, computer) will include three CPU genres. The main article doesn't mention anything about CQP computing. — Preceding unsigned comment added by 2.84.219.136 (talk) 09:38, 14 May 2015 (UTC)
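
Purely to make the "Markov chain" terminology in the comment above concrete (and not as an implementation of the proposed P-CPU, for which there is no source): a Markov chain is simply a process whose next state depends only on its current state. Below is a minimal classical sketch in Python; the three-state transition table and the names are invented for this illustration.

 import random
 
 # Invented transition probabilities between three states; the next state
 # depends only on the current one (the Markov property).
 transitions = {
     "A": [("A", 0.5), ("B", 0.4), ("C", 0.1)],
     "B": [("A", 0.2), ("B", 0.5), ("C", 0.3)],
     "C": [("A", 0.3), ("B", 0.3), ("C", 0.4)],
 }
 
 def step(state):
     # Sample the next state from the current state's transition row.
     names, probs = zip(*transitions[state])
     return random.choices(names, weights=probs, k=1)[0]
 
 state = "A"
 counts = {"A": 0, "B": 0, "C": 0}
 for _ in range(10_000):
     state = step(state)
     counts[state] += 1
 
 print(counts)  # long-run visit frequencies approximate the stationary distribution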

Quantum CPU with Carbon nanotubes (CNTs) Probabilistic Entanglement Channels (PECs)

An experimental stage of up to 4 entangled particles has been tested. At the final stage it is supposed to be able to entangle up to some billion particles for a personal device, and some septillion or more for quantum servers. That will take at least 300 years from now, according to present awareness. This article is not analytical enough about quantum CPUs built with Probabilistic Entanglement Channels (PECs), and that is a crucial mistake, because all quantum computing companies emphasize PECs' importance. Silicon tubes, rather than carbon-based ones, are more suitable for quantum computing (for biological purposes, multicarbons are preferred). — Preceding unsigned comment added by 2.84.213.154 (talk) 11:42, 12 May 2015 (UTC)

Quantum supercomputer

Is it really the case that a quantum computer is "also known as a quantum supercomputer"? I've never seen that usage before & suspect it should be deleted. --24.12.205.104 (talk) 00:48, 6 January 2014 (UTC)

Announcing new results

/(

"..(-2,334)‎ . .(Sorry, Wikipedia is not the place to announce new results. See WP:OR) .." - and wghere, WHERE is such place, for new ideas, announcing and discuss them, developing them more, and .."brainstorming",maybe ?, or just constructive discussion.. yap, according that "WP:OR" document, if we all would strictly managed by it, it would be means, - all advancing, all scientific and technical development would be almost stopped, in fact.. bcos no one would know about any new ideas, new results, new.. anything.. :/(

\( — Preceding unsigned comment added by Martin Hovorka (talkcontribs) 19:30, 2 September 2013 (UTC)

deletion of new idea announcement

/(

Hmm... I cannot even just announce, a bit, any new idea (IMHO quite promising, relevant, and with perspective, this my multi-sieve/sifting Q.C. idea), nor even give it a little advertisement and spread it around, this new, constructive, fresh-bright concept. :\( (But it is, and always was, my idea: trying to make Q. Computing with this concept of a sieve/sifter approach (sieving).) — Preceding unsigned comment added by Martin Hovorka (talkcontribs) 20:57, 2 September 2013 (UTC)

It's your fault for announcing the idea here. Doesn't everyone know by now that, if you want to patent an invention, you must do so within one year of publication? Under the new US patent law, publication may make it impossible to patent it at all. (And, yes, posting it on Wikipedia constitutes "publication".) — Arthur Rubin (talk) 21:07, 2 September 2013 (UTC)

other wikis for discussing quantum ideas

There is a common misconception that people who support the WP:OR policy are people who don't want to see any original research on any wiki.

Actually, some of us *do* want to see original research on a wiki -- but on an appropriate wiki where such research is on-topic, not just any random wiki.

There are over 10,000 English-language wikis. Pretty much any topic you can think of is on-topic at at least one of them. In particular, quantum computing seems to be on topic at several wikis, including http://quantiki.org/ , Wikia: 3dids, http://quantum.cs.washington.edu/ , http://wiki.qcraft.org/ , http://c2.com/cgi/wiki?QuantumComputing , http://twoqubits.wikidot.com/ , http://chemwiki.ucdavis.edu/Physical_Chemistry/Quantum_Mechanics , http://www.physics.thetangentbundle.net/wiki/Quantum_mechanics , etc.

--DavidCary (talk) 04:30, 15 January 2014 (UTC)

Contradictory statements regarding space?

The introduction suggests that simulating an n-bit quantum computer on a classical TM requires 2^n discrete states. But doesn't this contradict the later statement that BQP is a subset of PSPACE? (Erniecohen (talk) 15:50, 14 October 2012 (UTC))

It's just an example of how a classical computer could simulate a quantum computer. It doesn't mean that it's the only way to do so. --Robin (talk) 15:55, 14 October 2012 (UTC)
The problem is that, as written, it strongly implies that quantum computing provides a space advantage, which is just false, so the 2^500 crap should just be removed. The relevant connection between the models is just that a classical TM can simulate a quantum computer with a polynomial blowup in space, but is strongly believed to require an exponential blowup in time. — Preceding unsigned comment added by Erniecohen (talkcontribs) 16:04, 14 October 2012 (UTC)
I wasn't aware of this. Do you have a reference? Skippydo (talk) 19:27, 14 October 2012 (UTC)

Has this been resolved? It seems odd to me in any case to say that 2^500 complex values is equivalent to 2^501 bits (does this mean that there are only four complex values available)? W. P. Uzer (talk) 12:27, 2 February 2014 (UTC)

Given these two objections, I've commented out the sentence in question for the moment. W. P. Uzer (talk) 08:50, 4 February 2014 (UTC)
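
As a concrete illustration of the space question discussed in this thread: the naive way to simulate n qubits classically is to store all 2^n complex amplitudes, which is exponential in memory, whereas BQP ⊆ PSPACE says a classical simulation can instead use polynomial space at the cost of (believed) exponential time, for example by summing over computational paths. The short Python sketch below shows only the naive state-vector picture; the function name apply_hadamard and the overall layout are invented for this example and are not taken from the article or any particular library.

 import math
 
 def apply_hadamard(state, target, n):
     # Apply a Hadamard gate to qubit `target` of an n-qubit state vector.
     new_state = state[:]
     for basis in range(2 ** n):
         if (basis >> target) & 1 == 0:
             partner = basis | (1 << target)
             a, b = state[basis], state[partner]
             new_state[basis] = (a + b) / math.sqrt(2)
             new_state[partner] = (a - b) / math.sqrt(2)
     return new_state
 
 n = 3
 state = [0j] * (2 ** n)        # 2**n amplitudes: the exponential cost in space
 state[0] = 1 + 0j              # start in |000>
 for q in range(n):
     state = apply_hadamard(state, q, n)
 
 print(len(state), "amplitudes stored")   # 8 here; 2**500 amplitudes for n = 500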

misc suggestions / solving chess - programming the universe - more intuitive intro

I think it would be great if the article mentioned solving chess as something that quantum computing would allow.

I also think Seth Lloyd's book "Programming the Universe" should be referenced somewhere.

The first sentence, I think, is too technical. It tells precisely what the term means in physics, but it doesn't give any layman's sense of what quantum computing is. It almost sounds like: "Quantum computing is a process where computers use quantum systems to do computing." A layman will want to know something basic immediately, like the fact that quantum computers are extremely fast and will soon be replacing standard computers.

...My 2 cents. Squish7 (talk) 00:40, 26 March 2014 (UTC)

Quantum games

A collection of IPs is adding an announcement of "the first quantum computer game". Even if this were sourced to a reliable source (it's not at all in the reference specified, which is a blog [not even a blog entry], and all I can find is a blog entry pointing to the announcement by the creator), would it be appropriate for the article? — Arthur Rubin (talk) 05:49, 21 May 2014 (UTC)

The announcement by the alleged creator is not a reliable source, either. — Arthur Rubin (talk) 10:39, 24 May 2014 (UTC)

Page title should be "Quantum computing"

To my sensibility, this page title is all wrong, or maybe not entirely; we simply have to create a more general page about "general aspects and applications of quantum computing", like some quantum transistors that work partially as "quantum computers" but are cheaper to produce.

Computer science is no more about computers than astronomy is about telescopes. – Edsger Dijkstra

Additionally, Google

 "quantum computing" -"quantum computer"

gives 2,440,000 results

 -"quantum computing" "quantum computer" 

gives 961,000 results, so "quantum computing" appears without "quantum computer" significantly more often than the reverse. — MaxEnt 05:10, 5 May 2014 (UTC)

 Done – Ruud 18:01, 21 December 2014 (UTC)

strongly need to consider changing the title of this wiki-page from Q. "Computer" to "COMPUTING"

Can the title of the whole wiki-page be changed back to Quantum COMPUTING, as it was before? "Computing" is a far more fitting, more overall term for this wiki page's topic than just the one concrete "computer". — Preceding unsigned comment added by 78.99.236.255 (talk) 17:49, 23 July 2014 (UTC)

 Done – Ruud 18:01, 21 December 2014 (UTC)

Invented When?

You should discuss when quantum computing was invented and what its purpose was at the time.

2601:C:5600:27B:C92A:EEBE:6758:C647 (talk) 17:52, 18 January 2015 (UTC) Johnnie DeHuff

The basic idea occurred as early as 1965, in The Feynman Lectures on Physics, vol. 3, when Feynman mentioned that it was frequently faster to do the physical experiment than to do the mathematical computation. In other words, use a physics experiment as an analog computer. --Ancheta Wis (talk | contribs) 17:56, 1 February 2015 (UTC)

Minthreadsleft

Why is it 10? If we're going to archive, 5, or possibly 2, would be better. I already archived one thread which was (potentially) about the subject, rather than about the article. — Arthur Rubin (talk) 22:52, 4 May 2015 (UTC)

Changed to 5. I also "archived" 3 sections about alleged simulators of quantum computers, with no credible source. — Arthur Rubin (talk) 12:30, 31 May 2015 (UTC)

Three-bit Example?

It would be very helpful to see an example of a calculation or problem solved at the three-bit level. Is there such a thing? JFistere (talk) 03:11, 15 October 2015 (UTC)
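
One possible worked example at the three-qubit level, offered here only as an illustrative sketch and not as material sourced from the article: prepare a GHZ state with one Hadamard and two CNOT gates, then sample measurement outcomes; only 000 and 111 ever appear, each with probability 1/2. The helper names apply_h and apply_cnot in this Python sketch are invented for the illustration.

 import math
 import random
 
 def apply_h(state, t):
     # Hadamard on qubit t of a 3-qubit state vector (8 amplitudes).
     out = state[:]
     for i in range(8):
         if (i >> t) & 1 == 0:
             j = i | (1 << t)
             a, b = state[i], state[j]
             out[i] = (a + b) / math.sqrt(2)
             out[j] = (a - b) / math.sqrt(2)
     return out
 
 def apply_cnot(state, control, target):
     # CNOT: flip qubit `target` wherever qubit `control` is 1.
     out = state[:]
     for i in range(8):
         if (i >> control) & 1:
             out[i ^ (1 << target)] = state[i]
     return out
 
 state = [0j] * 8
 state[0] = 1 + 0j                  # |000>
 state = apply_h(state, 0)          # superposition on qubit 0
 state = apply_cnot(state, 0, 1)    # entangle qubit 1
 state = apply_cnot(state, 0, 2)    # entangle qubit 2
 
 probs = [abs(a) ** 2 for a in state]
 samples = random.choices(range(8), weights=probs, k=10)
 print([format(s, "03b") for s in samples])   # only '000' and '111' appear

A Deutsch–Jozsa run on two input qubits plus one ancilla would be another natural three-qubit example, if the article ever adds one.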