Quantum computing is an area of computing focused on developing computer technology based on the principles of quantum theory, which explains the behavior of energy and matter at the atomic and subatomic levels. Classical computers encode information in a single state - such as on or off, up or down, 1 or 0 - called a bit. Quantum computing, on the other hand, uses quantum bits, or qubits. Quantum computers perform calculations based on the probability of an object's state before it is measured - instead of using just 1s and 0s - which means they have the potential to process exponentially more data than classical computers.
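To make the "exponentially more data" idea a little more concrete, here is a rough sketch (our own illustration, not from the podcast) of why qubit registers grow so fast: describing n qubits takes 2^n complex amplitudes, whereas n classical bits hold just one of 2^n values at a time.

```python
import numpy as np

# n classical bits store exactly one of 2**n values at a time.
# n qubits are described by 2**n complex amplitudes simultaneously,
# which is where the "exponentially more data" intuition comes from.
n = 3

# Put all n qubits in an equal superposition: 2**n amplitudes at once.
state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)

print(len(state))           # 8 amplitudes for just 3 qubits
print(np.abs(state) ** 2)   # each of the 8 outcomes is equally likely
```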
Particle physicist and quantum computing expert Sarah Malik says quantum computers have yet to reach their full potential and that now is the time to get involved.
"We carry in our pockets now something that's way more powerful than that computer we had back in the 1950s, and I think it was impossible to have predicted just how pervasive and ubiquitous they were going to become."
Malik describes quantum computing with the analogy of a coin. The fundamental unit of information in a classical computer is the 'bit', which can only take on the values zero and one. In a quantum computer, the fundamental unit of information is the 'qubit', which can take on not only the values zero and one but also any linear combination of the two.
Think of it in terms of flipping a coin: with a classical computer, the coin has landed and you get either heads or tails. With a quantum computer, it is as if you were using the coin while it is still being tossed - you don't know whether it's heads or tails until you measure it and the state collapses.
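If it helps to see the coin analogy in code, here is a minimal sketch (our own, not from the episode) that simulates a single qubit as a coin mid-toss: an equal superposition of 0 and 1 that only 'collapses' to one value when measured.

```python
import numpy as np

# Basis states: |0> is "heads has landed", |1> is "tails has landed".
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A coin mid-toss: an equal superposition of |0> and |1>.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement collapses the superposition: outcome 0 with probability
# |amplitude_0|^2 and outcome 1 with probability |amplitude_1|^2.
probs = np.abs(psi) ** 2
rng = np.random.default_rng()
tosses = rng.choice([0, 1], size=10, p=probs)

print(probs)   # [0.5 0.5] - a fair coin until you look
print(tosses)  # e.g. [0 1 1 0 0 1 0 1 1 0]
```

A real quantum computer doesn't crunch probabilities this way, of course - the simulation just mirrors the bookkeeping - but it captures the "coin still in the air" picture Malik describes.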
Quantum computers have had, and are still having, major breakthroughs; they could eventually process data in ways we can't even comprehend today.
Listen to the full podcast episode here.