
What is Quantum Computing

What is quantum computing, how does it work and how is it different from regular computing?

An introduction to quantum computing, its scope and uses

Jun 24, 2021    By Team YoungWonks *

In one of our earlier blogs, we dissected the subject of supercomputing (https://www.youngwonks.com/blog/What-is-a-Supercomputer-and-What-are-its-Types--Uses-and-Applications). Another buzzword in the modern tech world is quantum computing. Before we delve into this rather complex topic, let us look at what the term quantum means. 

Outside of the sciences - be it quantum physics, quantum encryption or quantum computing - we keep hearing the phrase 'quantum leap'. So what does quantum mean here? Does it denote a huge leap? The word quantum has its roots in the Latin word quantus, meaning 'how much' or 'how great'. In physics, it refers to a 'discrete quantity of energy proportional in magnitude to the frequency of the radiation it represents.' Discrete here means non-continuous or distinct. In that sense, quantum refers to a distinct quantity of something. 

 

 

Quantum Computing Explained

In a nutshell, quantum computing is where certain algebraic methods - typically the same methods as, or ones parallel to, those applied in quantum mechanics - are used to develop algorithms for computations. Quantum mechanics in turn refers to a fundamental theory in physics that describes the physical properties of nature at the scale of atoms and subatomic particles. A quantum computer then is a computer that can implement such algorithms. So quantum computers are essentially based on quantum bits, also known as qubits, which can be realized in physical systems such as the spin of a single electron.

 

 

Let us try to understand quantum computing better by looking at regular computing. 

 

Quantum Computing versus Regular Computing 

A regular computer chip uses bits, which are essentially like switches in that they can either be on or off. The off position is represented by a 0 while the on position is denoted by a 1. Be it apps or websites, they are made up of millions of bits - all combinations of 1s and 0s. What is interesting to note then is that not everything in life can be captured neatly in ones and zeroes; often there is uncertainty involved as well. This is where quantum computing comes in. Quantum computing follows the principles of quantum mechanics, the set of rules governing nature at the very small scale of atoms and subatomic particles.
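To make the idea concrete, here is a minimal Python sketch (the text "Hi" is just an arbitrary example) showing that even ordinary characters are stored as patterns of ones and zeroes:

```python
# Every piece of classical data is ultimately a pattern of bits (0s and 1s).
# Here we peek at the 8-bit patterns behind a short piece of text.
text = "Hi"
for ch in text:
    bits = format(ord(ch), "08b")  # character code -> 8-bit binary string
    print(ch, "->", bits)
# H -> 01001000
# i -> 01101001
```

Every bit in those patterns is definitely a 0 or definitely a 1 - which is exactly the assumption quantum computing relaxes.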

 

Superposition

Instead of bits, quantum computers use qubits. And unlike bits, which have to be either on or off (aka 1 or 0), qubits can be in a combination of on and off at the same time. In quantum computing parlance, this phenomenon is called superposition.  

A common example used to understand this is that of a spinning coin. We know that a coin that has been flipped will land either on heads or tails. But what about a coin that is still spinning? Before such a coin comes to a stop, it has a chance of landing on heads and a chance of landing on tails. Superposition is like this spinning coin: it factors in uncertainty, something that ordinary computing does not do. 
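No quantum hardware is needed to see the arithmetic behind the spinning coin. Below is a toy Python sketch - a simplified model that ignores complex amplitudes and phases - of an equal superposition, where each outcome's probability is the square of its amplitude:

```python
import random

# A toy model of one qubit: two "amplitudes" for the states 0 and 1.
# An equal superposition (like a spinning coin) has amplitude 1/sqrt(2) each.
amp0 = amp1 = 2 ** -0.5

# The probability of each outcome is the amplitude squared (the Born rule).
p0, p1 = amp0 ** 2, amp1 ** 2
print(round(p0, 2), round(p1, 2))  # 0.5 0.5

# "Measuring" the qubit collapses it to 0 or 1 at random,
# just like the spinning coin finally landing on heads or tails.
outcome = 0 if random.random() < p0 else 1
print("measured:", outcome)
```

Until the measurement happens, both possibilities coexist; the measurement is what forces a single 0-or-1 answer.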

Since quantum computers can use ones, zeroes and superpositions of ones and zeroes, they can explore a huge number of possibilities at the same time. Thus, certain difficult tasks that have long been considered impossible for conventional computers to crack come within the reach of quantum computers. 

Another distinguishing factor is speed: for certain kinds of problems, quantum computing is far faster than ordinary computing. By deploying superpositions, quantum computers can work through many possibilities simultaneously. 

Take, for instance, a problem such as finding the way out of a maze. While a regular computer will explore each route one by one in order to zero in on the correct one, a quantum computer can, in effect, consider all these paths at the same time, enabling it to arrive at the solution - the way out - much faster. 
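The classical one-route-at-a-time strategy can be sketched in a few lines of Python (the 100 paths and the exit at path 73 are made up purely for illustration). A quantum search algorithm such as Grover's would, roughly speaking, need only about the square root of that many steps:

```python
# A classical computer checks candidate "paths" one at a time
# until it finds the exit - a toy stand-in for exploring a maze.
def classical_search(paths, is_exit):
    checks = 0
    for p in paths:
        checks += 1
        if is_exit(p):
            return p, checks
    return None, checks

paths = list(range(100))
exit_path = 73  # assumed for illustration
found, checks = classical_search(paths, lambda p: p == exit_path)
print(found, checks)  # 73 74  (74 routes had to be tried one by one)
```

In the worst case the classical search tries every single path; that linear, one-by-one cost is what quantum search algorithms aim to beat.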

Entanglement 

Another key concept that forms the foundation of quantum computing is entanglement. In physics, quantum entanglement refers to a very strong correlation between quantum particles. This correlation is so strong that the two or more quantum particles can be said to be inextricably linked despite great distances. 

To understand this better with a simple analogy that lies outside the contexts of physics and computing, let us go back to the coin example. Now imagine that not one but two coins are being tossed. Typically, whether one coin lands on heads or tails has no impact on the result of the other coin toss. But in the case of entanglement, both elements - no matter how physically distant they are from each other - are connected, or entangled. So in such a scenario, if one coin lands on heads, the other one will also show heads, and vice versa. 
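Here is a classical Python sketch of that perfect correlation. Note that this only mimics the "always agree" part of the analogy: real entanglement is stronger and cannot be fully reproduced by any shared classical value.

```python
import random

# Two independent coin tosses: the results are uncorrelated.
coin_a, coin_b = random.choice("HT"), random.choice("HT")
print("independent:", coin_a, coin_b)  # may or may not match

# A toy "entangled" pair: one shared random draw decides BOTH outcomes,
# so the two coins always agree, no matter how far apart they end up.
shared = random.choice("HT")
coin1, coin2 = shared, shared
print("entangled: ", coin1, coin2)  # always matches
```

Measuring one member of the pair immediately tells you what the other will show - the hallmark of the correlation described above.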

In other words, thanks to the quantum phenomena of superposition and entanglement, quantum computing allows for multiple and interconnected (hence correlated) outcomes. This means that quantum computers have a shot at solving complex problems in a much shorter timeframe than our best classical computers are expected to need. Quantum computing is thus essentially about harnessing the laws of quantum mechanics to process information.

A quantum algorithm then is an algorithm running on a realistic model of quantum computation. Essentially, it refers to a step-by-step process, where each step is carried out on a quantum computer. Particularly useful are quantum optimization algorithms, which are, as their name suggests, quantum algorithms used to solve optimization problems. Mathematical optimization is the process of finding the best solution to a problem from a set of possible solutions. In this context, a classical algorithm is a non-quantum one, referring to a sequence of steps performed on a classical computer.

 

Applications of Quantum Computing 

Armed with speed and efficiency, quantum computers show a lot of promise in several areas. 

It is being said that quantum computers can fast-track the development of Artificial Intelligence; Google is already using quantum computing to hone its autonomous driving software. 

Modelling chemical reactions is another area expected to benefit greatly from quantum computing. As of now, supercomputers can analyse only the most basic molecules. But quantum computers are expected to be able to deal with far more complicated reactions.

Increased efficiency in solving complex equations is in turn supposed to pave the way for a slew of efficient products - be it better solar panels or increased capacity of batteries in electric cars.

It is also being said that quantum computing can be used in the field of medicine to spur the production of cheaper medicines and even find cures for diseases such as Alzheimer's. Additionally, since quantum computers are very good at simulating systems with greater accuracy and in a shorter period of time, they have even been said to have helped in managing the COVID-19 pandemic, especially with regard to modelling its spread, the search for potential vaccines, and the advancement of therapeutics.

Since quantum computers will typically be of use in areas where large, uncertain, complicated systems need to be simulated, industries such as finance and weather forecasting also stand to benefit.

Cryptography also stands to be affected by quantum computing. Bear in mind that currently, encryption systems work because breaking down large numbers into their prime factors is a rather tedious, expensive and impractical affair for your regular computer. But with quantum computers expected to solve such factoring problems easily, our passwords - and as a result, our password-protected data - become vulnerable. In the event that quantum computers get around to decrypting this data, we would have to resort to staying afloat with - what else but - quantum encryption (aka quantum cryptography, the science of using quantum mechanical properties to perform cryptographic tasks).

Many believe that quantum simulations will be among the key applications of quantum computing. Quantum simulations make possible the study of quantum systems that are difficult to study in a lab and impractical to model with a supercomputer. Quantum simulators are special-purpose devices created to offer insight into certain physics problems. The idea of a universal quantum simulator - in effect, a quantum computer - was proposed by Yuri Manin in 1980 and Richard Feynman in 1982.

It is important to mention quantum information here. It is an interdisciplinary field involving quantum mechanics, computer science, information theory, philosophy and cryptography among others, and its main focus is extracting information from matter at the microscopic scale. The study of this field is also relevant to areas such as cognitive science, psychology and neuroscience.

 

Quantum Computers in Day to Day Lives

As mentioned above, these quantum machines hold within them a lot of potential, to say the least. IBM is already offering a quantum experience where users can sign up and use a quantum computer; in fact, one can even play a card game with it.

The above applications notwithstanding, there are many situations where quantum computers are likely to be outdone by their classical counterparts. For starters, quantum computers are highly sensitive: heat, electromagnetic fields and collisions with air molecules can make a qubit lose its quantum properties. This process, known as quantum decoherence, can in turn crash the quantum system.

Being highly vulnerable to interference, especially electrical interference, quantum computers will have to be stored in extremely cold and sterile environments. This in turn means that quantum computers will, in all likelihood, be used only remotely, by organizations such as banks, businesses and academia. And there is also a possibility that the computers of the future could be a combination of both the classical and quantum varieties.

 

Quantum Supremacy and The Road Ahead

US theoretical physicist John Preskill coined the term quantum supremacy to refer to the point at which a quantum computer can perform a task that no classical computer can complete in a practical amount of time. Some practical quantum technologies, such as highly effective sensors and actuators, are already emerging. These devices will help scientists navigate the nano-scale world with remarkable precision and sensitivity.

In October 2019, Google claimed that it had achieved quantum supremacy - the point at which a quantum computer can outperform an ordinary computer. A Sycamore processor created with Google AI Quantum is said to have done calculations more than 3,000,000 times as fast as those of Summit, then considered the world's fastest supercomputer. This claim may have been disputed by rivals such as IBM, but it goes to show that a lot of work is being done in the field of quantum computing research.

In December 2020, a group from the University of Science and Technology of China (USTC) developed Jiuzhang, the first photonic quantum computer to attain quantum supremacy. They did so by successfully implementing a restricted model of non-universal quantum computation on 76 photons. The USTC team claims that a contemporary classical supercomputer would need a computational time of 600 million years to produce the number of samples their quantum processor can generate in a mere 20 seconds.

And yet, quantum computers are not going to pervade our day-to-day lives anytime soon. They are far from hitting the consumer market, because the number of qubits that the best quantum computers have right now is around 50. While that does make them quite powerful - each added qubit doubles the number of states the machine can hold, so the processing capacity grows exponentially - these quantum computers also have really high error rates. So they are powerful and yet not reliable, which is a far from ideal combination.
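That exponential growth is easy to check for yourself: n qubits correspond to 2^n basis states. A quick Python illustration:

```python
# Each added qubit doubles the number of basis states that can sit
# in superposition: n qubits correspond to 2**n amplitudes.
for n in (1, 2, 10, 50):
    print(n, "qubits ->", 2 ** n, "basis states")

# Around 50 qubits, the full state is already enormous to describe
# classically: over a quadrillion amplitudes.
print(2 ** 50)  # 1125899906842624
```

This is why even a machine of "only" 50 qubits can represent states that are extremely expensive for a classical computer to simulate in full.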

 

 

 

*Contributors: Written by Vidya Prabhu; Lead image by: Abhishek Aggarwal

This blog is presented to you by YoungWonks. The leading coding program for kids and teens.

YoungWonks offers instructor led one-on-one online classes and in-person classes with 4:1 student teacher ratio.
