***
Science and technology were making great strides at the end of the nineteenth century, to the point where we were beginning to discover problems with the reality we thought we lived in. Newtonian physics does a great job of describing what we see around us, but it turns out this is an illusion created by the scale at which we operate. It’s like thinking the earth is flat because it looks that way, but it only looks that way because we’re too small to see its curvature; reality is in the eye of the beholder.
What we discovered as we looked closer with better
technology was that the universe isn’t a deterministic machine. The double slit
experiment caused great confusion because it looked like light was both a wave
and a particle. Rutherford’s gold foil experiment suggested that the recently
discovered atom was almost entirely empty space. Most of what you breathe in is
vacuum! The universe is much stranger than we first thought, and it isn’t
deterministic at all, but very much probabilistic. Einstein hated this ‘spooky
action at a distance’ quantum nonsense, but through the 20th Century
we’ve come to understand that this is how the universe works. Most people don’t know this because science education finds it easier to teach in a Newtonian way. Professor Brian Cox has a
good quote in his book, The
Quantum Universe: “It’s not Newton for big things and quantum for small
things, it’s quantum all the way.”
This emerging quantum awareness created the first quantum revolution. Once we recognized that quantum effects happen around us all the time, we started designing technology that made use of these newly discovered natural phenomena. If you think this is only for exotic university labs, you’re wrong. The flash memory in the device you’re likely reading this on depends on quantum tunnelling to work, as do lasers, MRIs and superconductors.
So, what’s all this talk about quantum computing and what
the heck does this have to do with cybersecurity? In the 1970s many researchers started theorizing about quantum computing, Richard Feynman pulled the ideas together in the early 80s, and then the race was on to turn theory into working hardware. What’s the
difference between this and passive 20th Century quantum technology?
We’ve developed the technology and theory now to engineer quantum outcomes
rather than just using what nature gives us. As you might imagine, this is
incredibly difficult.
I had an intense chat with Dr. Shohini Ghose, the CTO of the
Quantum Algorithms
Institute at the end of our quantum cybersecurity readiness training day
this week in BC. She was (quite rightly) adamant that we can’t know quantum
details without observing them and when we observe them, we change them, but my
philosophy background has me thinking that I’m going to try anyway. An
unobserved universe is entirely probabilistic. It only becomes the reality we
see when we perceive it. It reminds me of the Weeping Angels in my favourite Doctor Who episode.
This bakes most people’s noodles, but the math clearly indicates that in measuring a photon’s position precisely we can’t also know its momentum (its speed and direction) – that’s the uncertainty principle in action. I’m probably wrong about all of that, but I’d rather
people take a swing at understanding this strangeness rather than being afraid
of being wrong.
Alright, we’re halfway through this thing and you haven’t
mentioned anything cyber once! If you think about the electronic systems we
use, they’re entirely Newtonian. They reduce information to ones and zeroes and
produce the kind of certainty we all like, but this is a low-resolution approach
that is about to hit its limit. We’re building transistors so small now that electrons
are tunnelling through the nanometre-thick walls (atoms are mainly empty space,
remember?) between transistors, rendering future miniaturization impossible;
we’re nearing the limits of our Newtonian illusion. That means the end of
Moore’s Law! Panic in the disco!
Quantum computers don’t use electronics as their common base. A
quantum computer processor might be ionized particles, or photons, or nanotech engineered
superconductors, and those are just a few of the options. By isolating these
tiny pieces of the cosmos away from the chaos of creation and applying energy
to them in incredibly intricate ways, we can create probability engines that
use astonishing mathematics to calculate solutions to problems that linear
electronic machines could never touch, but unlike classical computers we need to
do this without observing the process or all is lost. Imagine if you had to
design the first microprocessors in the dark and you’re a fraction of the way
towards understanding how difficult it is to build a quantum computer, but it’s
happening!
We’re currently in what’s called the NISQ (noisy intermediate-scale quantum) computing stage. We’re still struggling with
applying just enough energy to get a particle to polarize how we want it to,
all while keeping the noise (heat, radiation) of reality out. That’s why you
see quantum computers as those big cylinders housing a golden chandelier. The cylinders are
radiation shields and containers to cool everything down to near absolute zero
(gotta keep that thermal noise out), and the chandelier is to keep the electronic
noise of the control systems (old school electronics) away from the quantum
processor.
My favourite quote from the PhDs I’ve talked to is, “a
viable quantum computer is five years out. And if I’m wrong, it’s four years.”
What does that mean for ICT types? Quantum computers don’t do linear. When you give them a problem, they leverage superposition – that state of being everywhere at once – to produce massively parallel computing outcomes completely foreign to what we’re familiar with in our multi-core processors. Quantum algorithms are designed to black-box the calculation so that observation doesn’t spoil the quantum process, then spit out answers as probabilities.
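That last point – answers arriving as probabilities rather than certainties – can be sketched in a few lines of plain Python. No quantum SDK is assumed; the amplitudes and the `measure` helper are just a toy stand-in for the Born rule:

```python
import random

# Toy model of one qubit: amplitudes over the basis states |0> and |1>.
# An equal superposition (e.g. after a Hadamard gate) has amplitude
# 1/sqrt(2) on each, so each outcome has probability 1/2.
amp0 = amp1 = 2 ** -0.5

def measure(a0, a1):
    """Collapse the state: |a0|^2 chance of 0, otherwise 1 (the Born rule)."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# One measurement gives one bit; only repetition reveals the probabilities.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(amp0, amp1)] += 1
print(counts)  # roughly {0: 5000, 1: 5000}
```

A real quantum processor runs the whole calculation before any measurement happens, which is exactly why observing mid-process ruins everything.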
What does that mean for cybersecurity? Peter Shor came up
with an elegant idea in the mid-90s that uses a quantum Fourier transform to find the period hidden in the arithmetic behind prime factoring. If you can calculate that period for a number built from two large primes (there is a repeating pattern), you can reverse engineer those primes. In RSA encryption or anything else that uses
factoring you could calculate the private key and tear apart the encrypted
transport layer handshakes rendering secure internet traffic a thing of the
past. From there you could imitate banks or governments or simply decrypt
traffic without anyone knowing you’re there. You won’t see cybercriminals doing
this because the tech’s too tough, but nation states will, though you won’t see
them either because they will be quietly collecting all of that encrypted
online data Imitation Game
style. This process may already have begun with harvest now, decrypt later (HNDL).
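To make the period-to-factors reduction concrete, here’s a classical toy sketch (the function names are mine). A real quantum computer would replace the brute-force `order` loop with a quantum Fourier transform – that’s where the exponential speed-up lives; everything after it is ordinary arithmetic:

```python
from math import gcd

def order(a, n):
    """Brute-force the period: smallest r > 0 with a^r ≡ 1 (mod n).
    This loop is the step Shor's algorithm performs quantumly."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Shor's classical reduction: an even period r yields factors of n
    as gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n)."""
    r = order(a, n)
    if r % 2:
        raise ValueError("odd period - pick a different a")
    return gcd(a ** (r // 2) - 1, n), gcd(a ** (r // 2) + 1, n)

print(shor_classical(15, 7))  # prints (3, 5) - the primes behind 15
```

For a 2048-bit RSA modulus that `order` loop would outlive the universe; the quantum Fourier transform collapses it to something tractable, and the private key falls out of the recovered primes.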
There is much more to quantum technologies in cybersecurity
than the encryption panic though. Recent research suggests that instead of running into limits with electron tunnelling in transistors, our new quantum 2.0 engineering could leverage that same effect to create quantum transistors orders of magnitude smaller and much faster than what we have now. Cybersecurity will
have to integrate that technology as it evolves. Quantum communication is
another challenge. NIST is standardizing mathematical quantum-resistant algorithms as I type this, but you could also leverage quantum entanglement itself for quantum key distribution (QKD). China has an entire network of satellites testing these hack-proof comms links
now. There could be quantum-locked portions of the internet in 15 years where high-security traffic goes. Guess who is going to have to manage those secure networks.
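The idea behind QKD can be sketched with a toy version of the BB84 protocol. The function and variable names below are mine, and the “photons” are just Python ints standing in for polarization states:

```python
import random

def bb84_sifted_key(n_photons=2000):
    """Toy BB84: Alice encodes random bits in random bases ('+' or 'x');
    Bob measures each photon in his own random basis. Where bases match,
    Bob reads the bit faithfully; where they differ, he gets a coin flip.
    They publicly compare bases (never bits) and keep matching positions."""
    alice_bits = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.choice("+x") for _ in range(n_photons)]
    bob_bases = [random.choice("+x") for _ in range(n_photons)]
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # The sifted key: positions where Alice's and Bob's bases agreed.
    return [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sifted_key()
print(len(key))  # roughly half the photons survive sifting
```

The security hook is that an eavesdropper must measure in transit, and wrong-basis measurements scramble bits that Alice and Bob later spot when they compare a sample of the sifted key – physics itself raises the alarm.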
If you’re in cybersecurity there is much more to quantum
than panicking about encryption. Anyone in the field would be well served by
digging in and researching this fascinating technological emergence. My
colleague, Louise Turner, and I presented at the Atlantic Security Convention
on this in April. Give
our presentation a look. There are lots of links to fascinating resources.
It’s time to free your mind, Neo.