What is quantum computing?
Whoa, let’s slow it down! On a normal computer, a bit can only be a “1” or a “0.” On a quantum computer, a bit (called a qubit) can be both at once, in a blend of the two known as a superposition. Picture the grid of streets downtown in your city. From any given intersection, you can usually only drive to the four intersections next to it. Now imagine “quantum intersections” that connect directly to eight other intersections, with no interruption of traffic flow.
That’s what quantum computing does: it creates shortcuts in computer logic. We’re already bumping up against the limit of how much processing we can push through a microchip without melting it. Quantum computing promises a leap in processing power, letting each chip do many times the work it does now.
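To make the “both at once” idea a little more concrete, here’s a toy Python sketch that treats a qubit as nothing more than two probability amplitudes. It’s a cartoon of the math, not a simulation of real quantum hardware, and the 50/50 split is just an illustrative choice.

```python
import random

# A classical bit is definitively 0 or 1.
classical_bit = 1

# A qubit is described by two "amplitudes": one for the 0 outcome and
# one for the 1 outcome. Squaring their magnitudes gives the probability
# of measuring each value.
amp_0 = complex(2 ** -0.5)   # amplitude for measuring 0
amp_1 = complex(2 ** -0.5)   # amplitude for measuring 1

prob_0 = abs(amp_0) ** 2     # 0.5
prob_1 = abs(amp_1) ** 2     # 0.5

# Until it is measured, the qubit carries both possibilities.
# Measurement forces it to "pick" one, weighted by those probabilities.
measured = 0 if random.random() < prob_0 else 1
print(f"P(0) = {prob_0:.2f}, P(1) = {prob_1:.2f}, measured: {measured}")
```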
Why does this threaten encryption?
The whole crux of encryption is that any code can be broken given enough time; the strength comes from the fact that cracking it by trial and error, also called “brute force,” would take billions of years. But a huge leap in processing speed could make automated code-breakers so fast that even the strongest encryption in use today would be trivial to crack.
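To put “billions of years” in perspective, here’s a rough back-of-the-envelope calculation in Python. The guess rate is a made-up figure chosen purely for illustration, not a benchmark of any real machine.

```python
# Rough arithmetic behind "billions of years" of brute force.
KEY_BITS = 128
GUESSES_PER_SECOND = 1e12          # a trillion guesses/sec, hypothetical
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** KEY_BITS           # number of possible keys
# On average, the right key turns up about halfway through the search.
years = (keyspace / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"Average time to brute-force a {KEY_BITS}-bit key: {years:.2e} years")
# Roughly 5e18 years -- vastly longer than the age of the universe.
```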
And of course, cryptocurrencies such as Bitcoin also rely on encryption strength. The currency is built around math problems so big that the only way to solve them is to pitch random answers at them until one checks out as correct. There’s only so fast any computer can do that, which is what makes mining a new Bitcoin an event of value. If quantum computing comes to the fore, all the remaining Bitcoins could be mined in a day, and then the encryption algorithms that keep them secure could be cracked just as quickly.
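Here’s a toy Python sketch of that guess-and-check idea. It’s heavily simplified compared to real Bitcoin mining (which uses double SHA-256 and a far harder numeric target), but the spirit is the same: there’s no shortcut, you just keep trying answers.

```python
import hashlib

def mine(data, difficulty=4):
    """Keep guessing nonces until the hash starts with enough zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Each extra zero of "difficulty" multiplies the expected work by 16.
nonce, digest = mine("example block data")
print(f"Found nonce {nonce}: {digest}")
```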
What could save Bitcoin?
It’s really not a matter of Bitcoin being doomed. The same technology that could break the encryption we have today could also be used to create new, stronger encryption that can’t be broken as easily. Still, at the very least, whole new encryption standards will have to be designed and the software built on them rewritten from scratch. It’s a lot of work, but it can be done.