Quantum Computing

Today we look to the future, focusing on how quantum mechanical phenomena can be harnessed to vastly increase our computing capabilities. Leaving aside quantum computing for a second – computing itself has pushed the human race forward phenomenally. It’s often said that what you know isn’t as important as knowing how to find things out – possessing the knowledge of how to acquire information is just as useful as the information itself. In reality you are only ever one step away from it. I view computing in the same way – sure, I may not be able to tell you the number pi to 1 million digits, but I know how to program it and can read it out from there; so I basically do know the number, you just need to give me a second to obtain it.
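To make that concrete, here is a rough sketch of what I mean in Python, using the mpmath library (one of several arbitrary-precision libraries that will do the job; I’ve asked for a thousand digits below rather than a million purely to keep the run short):

```python
# A rough sketch: arbitrary-precision pi via the mpmath library.
# A million digits works the same way, it just takes longer to compute.
from mpmath import mp

mp.dps = 1000      # decimal places of working precision
print(mp.pi)       # pi to roughly a thousand digits
```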

A common misconception, and apologies if this is not the case for you, is that quantum computing simply means a bigger, better or faster computer. There are most certainly speed advantages, but this is not the main motivation – those are questions of performance that could apply to any computer. I have discussed previously (see here) how it may in theory be possible to simulate everything around you using a computer. In a sense we already can simulate both real-world phenomena and figments of our imagination.


It is possible, I suppose, using complex magnetic fields to actually film hundreds of bullets flying at Neo before they gracefully stop in front of him, but I think most people realize it is in the filmmakers’ financial interests to simulate the bullets and their desired trajectories. Now where does this all tie in with quantum computing? Well, the big difference between a quantum computer and an ordinary computer is that an ordinary computer can never efficiently simulate a quantum computer – and since an ordinary computer can simulate everything we can see or generally experience, that leaves a real gap in our understanding of them. It was never going to be easy with quantum in the name. What this fact teaches us, however, is that there is no single complete explanation of how quantum computers work in everyday classical terms. We must therefore accept less rigorous explanations, which is of course unnatural.

We are comfortable that a computer uses bits of information, and that we can assign a binary language to these bits to give a description of the state of the computer. The quantum computer on the other hand uses qubits of information – the smallest units of quantum information. There is a fundamental difference in the scale of language we need in order to describe them: when we have 5 bits of information we only need 5 binary values, one per bit, to describe the state. When we have 5 qubits of information we need 32 numbers – staggeringly different, and calculated as 2^5. The bottom line is that to describe n qubits of information I need 2^n numbers to give the correct picture – this quickly escalates to the point where, at 300 qubits, we require more numbers than there are atoms in the observable universe. Hopefully you can see why classical computers can never efficiently simulate quantum ones; the numbers are enormous!
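To see how quickly the bookkeeping blows up, here is a small sketch in plain Python (ordinary arithmetic, no quantum libraries) that simply counts the numbers required:

```python
# How the classical description of a quantum register grows with qubit count:
# n qubits need 2**n complex amplitudes, versus n binary values for n bits.
for n in (1, 5, 10, 50, 300):
    print(f"{n:>3} qubits -> {2**n:.3e} complex numbers to describe the state")

print("  5 classical bits -> just 5 binary values (one per bit)")
```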

The general working, or perhaps current understanding, of a quantum computer is that quantum particles are suspended in a particle trap – poor little guys. These are the qubits, which require all of those different numbers to describe them – what we are actually talking about is a superposition of 2^n states of n quantum particles. Now in a quantum computer we have quantum gates (much like the logic gates in an ordinary computer, for those who know computing), which basically allow us to do something to the qubits – usually just one, but it is possible to involve more than one in the manipulation. When we talk of manipulation, it’s not the sort of witchcraft it sounds like; it could be as simple as shining light or lasers on the particles. When we manipulate one of the qubits we impact the description of them all – and this is hugely complex since, as we discussed, we need a huge number of values to describe them. We would not be talking about changing 5 qubits, but rather the 32 different numbers needed to describe them. It is as if we have a large deck of cards, we do something to one of them, and the ordering of the whole deck changes. This is not how things work in ordinary computers.
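The deck-of-cards point can be made concrete with a toy simulation. The sketch below (NumPy only, three qubits, with a Hadamard gate standing in for the “shining light” step) is of course exactly the kind of classical simulation that stops scaling very quickly, but at three qubits it shows how a gate aimed at a single qubit acts on the whole 2^3 = 8-number description:

```python
import numpy as np

n = 3
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
I = np.eye(2, dtype=complex)

# Start with all three qubits in an equal superposition: 8 equal amplitudes.
state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)
print(np.round(state.real, 3))   # [0.354 0.354 0.354 0.354 0.354 0.354 0.354 0.354]

# A gate on *one* qubit is still an operator on the full 2**n-entry state:
# build the 8x8 matrix H (x) I (x) I and apply it to the whole vector.
full_op = np.kron(H, np.kron(I, I))
state = full_op @ state
print(np.round(state.real, 3))   # [0.5 0.5 0.5 0.5 0.  0.  0.  0. ] - every entry moved
```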

Now the huge advantages come from the fact that an ordinary computer processes one combination of n bits at any one time – whereas the quantum computer holds all 2^n combinations in superposition at once. So although we won’t ever be able to efficiently simulate a quantum computer within our classical computers, we should be able to do it the other way around, which would allow some frighteningly amazing results. There are many “un-programmable” tasks, far beyond conventional computers, which will become programmable. This is powerful – not because we don’t understand how conventional computers work, but precisely because we do understand how they work, and as a result we know they cannot perform certain tasks in any reasonable time.
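For the classical side of that comparison, here is a hedged little sketch of what “one combination at a time” costs in practice (the is_solution test is a made-up placeholder for whatever property we happen to be searching for):

```python
from itertools import product

def is_solution(bits):
    # Hypothetical test; "all bits set" stands in for some hard-to-find property.
    return all(bits)

n = 20
checked = 0
for bits in product((0, 1), repeat=n):   # an ordinary computer walks these one by one
    checked += 1
    if is_solution(bits):
        break

print(f"Checked {checked:,} of {2**n:,} combinations")   # 1,048,576 of 1,048,576
```

Each extra bit doubles that count – which is exactly the wall the quantum machine is claimed to sidestep.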

Whilst we have talked about this amazing complexity and seemingly impossible tasks, it is only fair to bring you back to 2016. There is no firm consensus on the world record for the largest quantum computer, but the range is around 5-10 qubits. Constructing quantum computers is really difficult, and truly we don’t know how big we are going to be able to go, or whether there will be some incredible machine at the end of it. As we know, theory does not always yield the desired outcome in practice; the most complex factorisation a quantum computer has managed to date is 21 = 3 × 7. At worst, however, it seems probable we will deeply enrich our understanding of the quantum universe, which is not a bad place to be. At best, we may leapfrog the human race forward further than any paradigm shift before – so probably worth some attention.

Finally, my apologies to those who feel they are hearing a little too much of me! Mekhi is currently doing her thing at space conferences in India, but will be back online next week.

 

30 responses to “Quantum Computing”

  1. I imagine the main practical problem is how to reliably set the qubit to a 0 or 1 state in memory and then read the state back. The speed of these input/output operations could limit the speed of the computer.

    (And does the quantum gate reliably store the information, or is it like Schroedinger’s Cat, where looking at it changes the value!) 🙂

    A quantum computer’s main advantage would be squeezing more data into a smaller space. Our current computers began with glass vacuum tubes, which became individual silicon transistors, which became millions of transistors photo-reduced and light-etched with photosensitive chemicals in layers onto an integrated silicon chip. I presume the quantum gate would be much smaller than these, but wonder how they will be manufactured.

    Back in the old days there was talk of tri-state computing using -1, 0, +1 as negative charge, no charge, and positive charge. So there may also be ways to have more than two states with quarks.


    • Well the key thing is actually that qubits don’t have just a 1 or a 0 state – the language is far richer than binary (in terms of the number of numbers required to describe it), which leads to their distinct advantage. And the multiplication of states when we add more and more qubits is the really exciting thing for the future – we just need to work out how we can actually build the things! I totally agree that construction is likely to be the biggest challenge – the theory is more sound than the engineering.


    • Well I suppose the exciting thing is we don’t really know. They should be able to perform classical tasks impressively, but I think the biggest changes will be in the intelligence we are able to program into them. They will be able to read handwritten texts, for example, in the near future – but it is quite possible we can create an emotional intelligence in the machines.

