Transcript
Hello, everybody. I'm Sam Samuel from Cisco Systems. I'm here
to give you a talk on quantum communication challenges in scaling quantum
computers. This is a general talk. It'll cover a fair amount of ground.
It'll go from basic quantum computing and how
we can apply it in a variety of ways,
whether it poses threats for the way we currently use
classical networks, and whether the same principles used in quantum computing
can be applied to quantum communication, and whether that would be a beneficial
direction to go in. And we'll eventually look at the problems we will face
if we choose to go in that direction. These, of course, give you an idea
of how the interaction between quantum computing and quantum
communication could play out in the future. So it's
a longish talk, hopefully it will be interesting for you, and so we'll dive right in.
So the first question we really want to ask is, why is it that there's
a fair amount of interest in quantum computing today?
And I guess the best way of answering that right
now is to sort of look at the investment levels from
the venture community, investment community into quantum technologies of any
sort. And this chart here shows us that there's an increasing amount of
investment going into this particular area. The total investment is over a billion dollars now,
although this chart here is over two years old. So I'm assuming that
the total investment made is now well beyond the billion and a half dollars that we
managed to count in this particular chart.
And that investment is spread over three things. It's spread over quantum computing,
and it's spread over quantum software.
And then it also includes quantum communication and quantum
Internet type things. It seems natural that the way that
the investment is spread is kind of weighted towards
quantum computing, because without that, none of the other two really kind of make
sense. And it's kind of easy to see that without
a quantum computer to run it on, quantum software
won't be at all useful. So my
interest is more to do with the quantum communication and quantum Internet side
of things, and this talk will look at that particular area in more
detail. But the point is, all aspects of quantum computing
or quantum technology are being investigated and invested in. And this
talk is trying to give you an idea of where we are in the state
of all of that, really. So with that in mind,
then, the first question we have to ask is, what is quantum
computing? Quantum computing is very different from classical computing.
In classical computing, we're quite used to dealing with bits.
A bit is zero or one. In quantum computing,
we are dealing with a quantum bit, a qubit. And a
qubit is different from a classical bit in the sense that
it has the superposition of zero and one,
meaning that the same bit represents zero and one simultaneously.
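As a toy illustration of this, a qubit can be written down on paper as a unit-length vector of two complex amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes. The sketch below uses plain numpy state vectors (illustrative only, not any particular quantum SDK):

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector a|0> + b|1>
# with complex amplitudes satisfying |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition: apply the Hadamard gate to |0>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0                  # (|0> + |1>) / sqrt(2)

# Measurement probabilities are the squared amplitudes.
probs = np.abs(psi) ** 2
print(probs)                    # zero and one are equally likely: 0.5 each

# A two-qubit register in superposition covers all four
# combinations 00, 01, 10, 11 at once, each with probability 0.25.
register = np.kron(psi, psi)
print(np.abs(register) ** 2)
```

So a register of n qubits carries 2^n amplitudes at once, which is the "entire gamut of possibilities in one throw" mentioned above.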
And if we have a register of two
qubits, that register will then represent,
because of superposition, all possible combinations of
zeros and ones in that register. So, in other words, it covers
the entire gamut of all possibilities in one
throw, if you will. The other thing that quantum computing
uses an awful lot is entanglement. So,
entanglement, in this case is where two particles
are highly correlated, and it's very difficult to break apart
an individual state between two particles that are entangled.
So when it's difficult to break apart the state,
that means the states are so highly correlated
that we can't divide them apart.
So it becomes very difficult for us to look into a quantum system
and get a definite answer out of it. And entanglement
is what makes it very difficult for us to go in
and inspect an individual particle. We have to
look holistically at the overall system. Quantum computing
works off quantum circuits that are probabilistic in
nature. This is in contrast to classical circuits,
which tend to be very deterministic in nature:
you apply the same computation, you get the same answer.
In quantum circuits, that may not necessarily be the case.
The outcome is very probabilistic in nature.
So these three things,
superposition, entanglement, and probabilistic computation,
come together when we do quantum computing. And the final
figure in this particular slide shows that
a classical circuit takes a sequential path through
a computation. I say sequential rather than predefined, because I
know you can do conditional statements inside a program.
But the point is that it's a very sequential way
that the program executes, whereas in a quantum
system, it's very different because of superposition,
we're able to explore the entirety of the state space of the
problem that the algorithm is addressing. And in so doing,
as we explore the entirety of the state space, we can then come to a
resolution of the answer much quicker and
more effectively than we could do otherwise. And that's the
benefit that quantum computing brings. So how
can we apply that? And so the next few slides look at what
the possible implications are of having such a computer.
So a computer that's able to explore the
entirety of state space kind of changes the way we
look at the potency of calculations that such a computer would have.
So this slide here has a table on it, and the
table kind of indicates the generations that a particular
technology would have. So generations zero, one, all the way through to n.
We're quite used to measuring the potency of calculations in
a silicon based system, a classical based system, through Moore's law. So in
other words, we expect every 18 months or so for
a silicon based processor to double in its potency
over time. And that's what the top line of
this particular table shows.
The second line, the double exponential line, is showing
something called Neven's law. So Neven's law is
different from Moore's law in the sense that it's a double exponential rise in
potency. So it has a different sequence. So it's
not a case of doubling every generation. It's a case
of double doubling every generation. So rather than 1, 2, 4, 8,
we have 2, 4, 16, 256. So we can
see that if we have a computer which has that kind
of potency or that kind of doubling law associated with it,
then it can become very potent in
a short number of generations. This double exponential
law was observed by a guy called Hartmut Neven.
Neven realized in his experiments that, in order to
prove the potency of the quantum computer that he had, he had
to go and check it against a classical computer. And these
classical computers were running out of potency far faster than he
expected. When he looked into it, it was a double exponential law.
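The difference between the two growth laws is easy to see numerically. A minimal sketch (the generation count and starting values are illustrative, not taken from the chart):

```python
# Moore's law: capability doubles each generation, 2**n.
# Neven's law (after Hartmut Neven): a double-exponential rise,
# roughly 2**(2**n) -- "double doubling" each generation.
generations = range(5)
moore = [2 ** n for n in generations]
neven = [2 ** (2 ** n) for n in generations]

print(moore)  # [1, 2, 4, 8, 16]
print(neven)  # [2, 4, 16, 256, 65536]
```

After only five generations, the double-exponential curve is already four orders of magnitude ahead, which is why a quantum computer on this trajectory becomes potent so quickly.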
Now, the interesting thing with that is
that if you have algorithms, such as factorization
algorithms, which can be optimized for quantum computers,
then you start to see the potential that this kind of computation
has versus classical computation. And if you have
a means of factorizing numbers very,
very quickly, then you have a means of actually breaking ciphers in classical networks,
which is what we're going to show in the next slide. So if we have
a quantum computer that has a double
exponential rise in its potency in the number
of qubits it's able to employ against a problem,
then one of those problems could be the factorization of
keys in a network. And so if we apply
that to the current cipher
algorithms that are in networks today, we can see that a
quantum computer has the potential to break the cipher
of certain key lengths. And this is what this chart is showing.
And it also shows that there comes a point when Nevin's
law starts to accelerate the potential
for a quantum computer to break the ciphering in current networks.
Now, this may seem a little bit
doom and gloom, in the sense that, oh, the world will collapse. That's not,
strictly speaking, true, because, of course, the very clever people at NIST
have run competitions to ensure that we are slightly ahead of
the curve in terms of the new quantum ciphering,
or post-quantum cryptography, rather, that would be applied to combat these techniques.
In terms of computation, though, if we go to the next slide,
we really have to realize that certain kinds
of algorithms are more suited to breaking certain
kinds of key. So keys which are
PKI in nature are impacted by Shor's algorithm. Keys which are
more symmetric in nature, the symmetric ciphering that's going on, cannot
be addressed as easily by Shor's algorithm. They can be addressed by a different algorithm
called Grover's algorithm. And these algorithms are not as effective
in breaking longer and longer keys.
And so that's what this chart shows. So, long story short,
as the potency of quantum computers increases,
then the likelihood of a quantum computer
being able to break ciphering of current networking
starts to go up. But that data isn't necessarily being
broken on the fly. It's because the data has been siphoned
off from the network and is stored somewhere where a bad actor
can take his time over trying to break the key if he has a quantum
computer at his disposal. So that's one
application of quantum computers.
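To make the Shor-versus-Grover distinction above concrete: Grover's quadratic search speedup roughly halves the effective bit strength of a symmetric key, which is why the usual advice is simply to double symmetric key lengths, whereas Shor's polynomial-time factoring means public keys cannot be rescued by length alone. A rough back-of-the-envelope sketch:

```python
# Grover's algorithm searches N items in ~sqrt(N) steps, so an
# n-bit symmetric key offers only ~n/2 bits of security against
# a quantum attacker. Doubling the key length restores the margin.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(key_bits, "->", grover_effective_bits(key_bits))
# 128 -> 64, 192 -> 96, 256 -> 128

# Shor's algorithm, by contrast, factors in polynomial time, so
# PKI keys based on factoring cannot be saved just by growing them.
```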
But of course, we can apply quantum computing in different directions,
and there are other sorts of very complex
problems that need solving, and there are other NP-hard problems out there
that are in need of the attention of something like a quantum computer.
And so we would like to be able to
increase the potency of the computers that address that particular problem.
So it becomes natural to state the following obvious point,
that you can always make a more powerful computer by joining
computers together. And this is also true for quantum computing.
So if we're able to join quantum computers together to make
a larger quantum computer, then we can solve more complex problems
more effectively and probably quicker as well.
So I want to turn this talk really now onto how
we would do that, how we would try and join computers together. But before
we do that, it's probably worth exploring what we mean by this.
So the idea of having
a parallel execution path in a program has been out there for quite some
time. So back when I was at university in the late 1980s,
early 1990s, there was an awful lot of work going on
in parallel computing. So we were able to get an application,
break the application down into a set of parallel chunks, and then try and speed
up the computation by applying the parallelism to that particular application
or the algorithm that we were looking at.
We want to do it slightly differently for quantum computing.
And the reason for it is that not
all quantum computers are good at solving every single problem.
In other words, there's no such thing as a general quantum computer
yet. These computers tend to be very specialized and look at
either annealing like problems or optimization like problems, or they're very
good at factorizing, but not necessarily a mixture of both.
Of course, over time, the situation may actually change.
But it's important for us to understand that
when we present an application or
a program to a network, we want to make sure that that application is broken
down and distributed to the appropriate quantum computer. So that's another
reason why we would like to have networking involved
in this kind of evolution.
So in order for us to communicate,
it's probably worth reexamining some of the principles that we
looked at in terms of quantum computing and apply them here a
little bit more. We obviously want to
ensure that a network is secure. So if we're able to communicate,
we want to make sure that the network itself is not
able to be eavesdropped. Interestingly enough,
if we're working in terms of quantum state and entanglement,
it's impossible to clone an unknown
quantum state. In other words, we cannot copy quantum state if we don't know what
that quantum state is. So that kind of makes the
prospect of having a secure quantum
network in the future kind of appealing.
If we were able to take advantage of this particular theorem, the no cloning theorem,
communication would also require entanglement. So we mentioned
entanglement in the past, very briefly, in passing.
It's the same entanglement here. And this is where essentially
two particles have overlapping
wave functions. And as a result of the overlapping nature of the
wave functions, it is impossible for us
to distinguish the individual state of each particle,
and therefore the states are entangled. The interesting thing behind that entanglement
is that, and we've probably heard this
in the popular scientific press as Einstein's spooky action at a distance,
if we influence one of the entangled particles,
the other entangled particle responds almost immediately. And so you
get a situation like this, where as we apply a force on one,
the other one responds. Now, at this point,
people tend to get a bit confused and say, well, that's breaking the
laws of the speed of light and so on and so forth. And that's not,
strictly speaking, true, because for us to understand the
impact of that, we have to take a measurement of the particle, send that measurement
over to the destination, or the other half of the entangled pair,
and then recover what the message was. And so, at that point,
it all slows down to the speed of light, or slightly below the speed of
light. So we don't actually break any laws of physics here, even though it can
be an interesting thing to try and prove to yourself that's not happening.
So, those are two concepts here.
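The correlation just described can be seen directly in the state vector of the canonical entangled pair, the Bell state. A small numpy sketch (illustrative only):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Bell state (|00> + |11>) / sqrt(2): the canonical entangled pair.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Outcome probabilities over the basis |00>, |01>, |10>, |11>:
# 00 and 11 each occur with probability 0.5; 01 and 10 never occur,
# so the two particles are perfectly correlated.
probs = np.abs(bell) ** 2
print(probs)

# Yet neither qubit alone has a definite state: there is no pair of
# single-qubit vectors whose tensor (kron) product equals `bell`.
```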
And then the question then becomes, can we use at least
the entanglement to be able to communicate over
a distance? And this is where teleportation comes in.
So, teleportation, at its very essence, is that we have
an entangled pair. So we have managed to overlap the
wave functions of two photons. Say we've distributed them to the end of a communication
link, and now we want to introduce to that communication
link a particle that
has state on it, and we want to transmit that state across
the network. So, how do we do that? So, at the bottom of
this slide here, we have this qubit, this matter qubit that we
wish to transmit across the network. We need to entangle that matter
qubit with the communication qubit on the left hand side
of the chart here. So, in order to do that, we take a
bell state measurement of the interaction between the two particles.
That produces a measurement result. We send that measurement result over
to the counterpart half of the
entangled pair at the other end of the communication link. We apply a localized
operation to it, and then that localized operation recovers
what the original matter qubit was as it entered the system.
And we get the teleported qubit materializing at the far
end of it. So we've not actually transmitted the state across the network
at all. And hence, that's why we say it teleports.
And so we have a mechanism, fundamentally, that we can use for communication
across any kind of distance, and that's teleportation.
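The whole protocol just described, entangle, take a Bell state measurement, send two classical bits, apply a local correction, can be simulated exactly with small state vectors. A self-contained numpy sketch (the gates and structure follow the standard textbook teleportation circuit, not any specific hardware):

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-qubit gates.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def lift(gate, qubit):
    """Apply a single-qubit gate to one qubit of a 3-qubit register."""
    mats = [I2, I2, I2]
    mats[qubit] = gate
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

# CNOT with control qubit 0 and target qubit 1 (identity on qubit 2).
CNOT01 = np.kron(
    np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=complex),
    I2,
)

# A random unknown state on qubit 0 -- the "matter qubit" to teleport.
amps = rng.normal(size=2) + 1j * rng.normal(size=2)
psi = amps / np.linalg.norm(amps)

# Qubits 1 and 2 hold the pre-distributed entangled (Bell) pair.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Bell-state measurement on qubits 0 and 1: CNOT, then Hadamard,
# then read both qubits out in the computational basis.
state = lift(H, 0) @ (CNOT01 @ state)
amps3 = state.reshape(2, 2, 2)               # indexed by (q0, q1, q2)
joint = (np.abs(amps3) ** 2).sum(axis=2)     # P(m0, m1)
m0, m1 = divmod(rng.choice(4, p=joint.ravel() / joint.sum()), 2)

# Collapse onto the measured branch, leaving qubit 2's state.
remote = amps3[m0, m1]
remote = remote / np.linalg.norm(remote)

# The two classical bits travel to the far end, where local
# corrections recover the state: X if m1 == 1, then Z if m0 == 1.
if m1:
    remote = X @ remote
if m0:
    remote = Z @ remote

# Qubit 2 now carries the original state (up to a global phase).
fidelity = abs(np.vdot(psi, remote)) ** 2
print(round(fidelity, 6))  # 1.0
```

Note that `psi` itself never crosses the link; only two classical measurement bits do, which is why the scheme respects the speed-of-light argument made earlier.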
Now, in order to make teleportation work,
we really need to extend how we do this
teleportation, and we do that by successively
swapping entanglement on a communication link. So, here we
have three maximally entangled pair segments,
pair a, pair b, pair c, and what we're going to do
is entangle pair a with pair b and have an
entanglement between a and b, and then do the same again to extend the entanglement
to c. And then we have an end to end entanglement across the entire
link. So, in order to do that we apply the same principles we
did in the previous slide. In other words, we do an entanglement
swapping measurement between the far pair of
a and the near pair of b. We send that measurement to the far pair
of b. We do a local operation on it. And then that means that a
and b, or the near pair of a and the far pair of b,
are now entangled. And then we perform the same measurement again between
the far pair of b and the near pair of c. We do an entanglement
swapping measurement there. We send the measurement result over
to the far end of c. We do a local operation there.
And then, as a result of that, we have entanglement between a and c.
So, to communicate now over the entirety of that link, we present the
matter qubit to a, and we do the same performance again.
We take a Bell state measurement of the result of that
interaction, and we send that over to c, and we recover at
c, by doing a local operation,
the original matter qubit that was presented at a. So that's how
we would extend distance on
such a system. The interesting thing is,
while it appears that we have the answer in our hands, the truth of
the matter is that in order for us to be able to entangle
over a distance, we realize there's a limit to how far
we can send an entangled pair and get away with it.
I think, quite honestly, my interest in quantum
communication actually came through reading this particular paper here that I'm
quoting by Suraf Kumar, and he
kind of indicated that in order for us to generate
entanglement at a certain rate, that rate will only propagate
a certain distance. So that means we have to be able to regenerate
entanglement if we wish to communicate over larger and
larger distances. So this brings the idea of a
quantum regenerator. So, in order for us
to increase the distance over which we're able to teleport, we have to chain
successive segments together, and we do that chaining with quantum
regenerators. Now,
that's well and good, but that's kind of like a point to point thing.
And really, we want to get to a network where we're able to direct how
and where the entanglement goes. So we're able to pick out
a source and a destination at will and ensure that the
source and destination are entangled. And in order to do that,
we need to be able to have at our disposal
an adequate distribution rate of entanglement. In other words,
if we want successful communication, we have to be able to ensure we keep distributing
the entanglement to the ends, and therefore, we're able to do this teleportation
that we mentioned in the previous slide. But also the
swapping is also a means of us being able to route entanglement
around the network. And that's kind of the critical thing, although I've
not mentioned it specifically here, that quantum memory starts to play a
very important part in that type of activity
inside a network. So if we're able to generate
entanglement at a sufficiently high rate, commensurate with the communication
needs of a quantum network, and we're able to do entanglement
swapping in memory, then we're able to manipulate
entanglement around a network, and we're able to target a source and a sink.
So that's what this slide is basically saying.
So what are the challenges associated with that? Well, the challenges
are essentially twofold. There are qubit
challenges, and then there are networking challenges.
I'll address the qubit challenges first, because we are dealing
with particles, and these particles are very fragile
in nature in the sense that they start to interact with their environment,
or they're able to interact with their environment quite
easily. So as a result, there's going to
be errors in the way that these qubits behave.
And that's not unnatural, of course.
So that means that if we don't have the right
interactions when we set up the quantum circuit, we're going to have
programming errors in initiating the states. And that's
one set of challenges that we're going to face. In other words,
how faithfully can we program something? Then the
qubits themselves are likely to interact with the environment that
they're in, causing decoherence between the entangled particles.
And that's actually quite an important one in the sense that
we have to be very careful how we design these circuits to ensure that we
minimize that kind of exterior influence onto
the qubits that form part of the circuit.
And that usually means we have to reduce
the noise in the circuit by actually reducing the temperature of the circuit. And that's
something which we'll come to in a minute.
But at least for communications, we communicate with photons,
and photons tend to be very much at room temperature. So that's not necessarily a
problem for quantum communications, but it is a problem if you
want to go from a photon for communication into
a matter qubit for storage and memory, say. And so there's a problem there.
These particles, these qubits that we're dealing with, have
a lifetime, a shelf life, if you will.
They don't necessarily last forever.
In fact, they don't. The coherence times that
we're seeing, in other words, the time at which we can guarantee the
entanglement is not necessarily a long time.
Obviously, the techniques are improving, and material science is improving all the time,
and we can extend the lifetime of the coherence that these
entangled particles have. But the fact of the matter is, they have a shelf life,
and we have to ensure that all operations are done before the shelf life expires,
if you will. The other sorts of problems that
we have at this level are operation error.
So is the
gate, or is the device faithfully acting on
the qubit in the way we expect it to? And again, because of the
interaction with the environment, that may not necessarily be the case, not because it doesn't want
to behave properly. It's because that's the way physics works at that level.
And then there are other issues as well with communication
qubits that are well known and have been known for some time, and that is
that photons can be absorbed by the fiber that they're traveling down.
And so we're going to get photon losses. And that can
also mean that we are losing state.
And if we lose one half of an entangled pair,
then essentially we have a problem because we don't have an entangled pair anymore,
and we have to go and regenerate and so on and so forth. That causes
problems. Now, these are the kinds of challenges that we face
at the qubit level, but it's also important
to know that we're looking
at the health of that qubit, too,
the health of the entanglement that that qubit has. And that's usually
referred to in terms of fidelity. And fidelity is a measure between zero and one
that sort of indicates how good something is,
how accurate, or how
well matched the
entangled qubits happen to be. And so
it's not only that we have these errors, but we also have
to make sure that before we start any of these operations,
the qubit that we're presenting to the system has a high fidelity
to it as well. So that's another issue that we have to try and cover
and produce mechanisms to ensure that we have high fidelity qubits
available to us. So those are the qubit challenges.
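Fidelity for pure states has a simple closed form, the squared overlap of the two state vectors, which is worth seeing concretely. A minimal numpy sketch:

```python
import numpy as np

# Fidelity between pure states |psi> and |phi>: F = |<psi|phi>|^2,
# a number between 0 (orthogonal) and 1 (identical).
def fidelity(psi, phi):
    return abs(np.vdot(psi, phi)) ** 2

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)

print(fidelity(ket0, ket0))  # 1.0 -- perfect match
print(fidelity(ket0, ket1))  # 0.0 -- completely wrong state
print(fidelity(ket0, plus))  # ~0.5 -- partial overlap
```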
From a networking perspective, there are a different set of challenges,
one of which we've kind of covered already, which is that in order for us to
do communications well, we have to have
photonic qubits that have high fidelity and
high purity. So if we're going to pass them through successive
gates, or if we're going to send them over any distance,
we really have to ensure that they're very healthy before they start their
journey. And that's what this high fidelity and purity really means.
We also want techniques available to us that allow
us to compensate for loss and
decoherence. And one of those techniques is quantum
error correction. If we're able to get good
quantum error correction, that protects against decoherence,
and it also protects
against quantum noise. But the problem here is that
quantum error correction requires us to have many more qubits
available to us, and so the overall population has to increase
in order for us to take advantage of that. So that generation of qubits that
we had before starts to play a more important part if we wish to go
down this quantum error correcting route. Network synchronization
is also another problem that we have to address.
We have synchronization in networks today, but the level of synchronization
that we're after is far tighter than we have in current networks,
and it needs to be tight, because we have to ensure that when we
generate entanglement, we can measure accurately
or determine accurately when particles are entangled.
And that requires us to have a high degree of precision in the timing of
the network. And so also
the lifetime of the qubit starts at the
moment it's entangled. And so we have to know how much
longer we have left on these qubits in order for
us to complete the operations that we need within the specified amount of time.
So it kind of starts to impact the way we schedule, as well
as acknowledging the overall health
through having an accurate measure of decoherence time on
such particles as well. So those are the network challenges.
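To get a feel for why the quantum error correction mentioned above costs extra qubits, the classical three-bit repetition code is the simplest analogue. Real quantum codes must correct without directly measuring the data, so they are considerably more subtle, but the overhead trade-off is the same in spirit: more physical qubits per logical qubit in exchange for a lower logical error rate. An illustrative sketch:

```python
# Simplest error-correction analogue: encode one logical bit into
# three physical bits and take a majority vote on readout.
def encode(bit):
    return [bit, bit, bit]

def decode(block):
    return 1 if sum(block) >= 2 else 0  # majority vote

block = encode(1)
block[0] ^= 1            # a single bit-flip error in transit
print(decode(block))     # 1 -- the logical bit survives

# With physical error probability p per bit, the block only fails
# when two or more bits flip:
p = 0.01
p_logical = 3 * p**2 * (1 - p) + p**3
print(p_logical)         # ~0.0003 -- much lower, at 3x the qubit cost
```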
So, is there a pragmatic way forward here?
Interestingly enough, there is a paper that came out in
2016 by Sreraman
Muralidharan. I hope I've pronounced his name
correctly, but he and his colleagues produced an excellent paper
that indicated that there's a generational shift.
And being engineers and scientists, of course, we christened these the
first, second and third generations, which is quite novel of us.
But the point is that each generation starts
to improve the efficacy of
the overall system by introducing quantum error correction
in two directions. Either it's used
to compensate for loss, or it's used to compensate
for operational error. So the focus is on
loss and operational error, and depending on how we apply it, that determines
which one of these generations you're in. So, to put it into perspective,
even though the paper was written in 2016, where are we today?
Today we are still scratching around at the first generation quantum repeaters,
which means that we still have a very long way to go before we get
to second and third generations, where we have a lot more
reliability in the overall communication
system. But the
community, the academic community,
and also the startup community are starting to address these problems, and this is
starting to produce some results that I think are
looking good for how quantum networking will
evolve over the next few years or so.
We know that there's a roadmap, if you will,
in terms of generational shifts for the quantum repeater that we need in order to
extend the end-to-end distance over which we can communicate.
That means that the prospect in the future is kind of interesting for us
to keep plowing on in this particular direction.
And if we do plow on and we're able to solve these problems,
then the prospect of getting to a quantum Internet starts to look kind of appealing.
Now, I need to put this into context. To get to a quantum Internet is
not like, oh, we solve something tomorrow, and then in two
years time we'll have a viable commercial product
that you can go and buy, put into your network. Everything's great. It's not quite
like that. I think we are talking maybe ten years or so before we start
to get to a reliable network at
reasonable entanglement rates that can start to teleport
at reasonable rates. Then after
that, there's going to be a few years of development before we get to
a quantum Internet where we can teleport with high fidelity,
so that any
pair that wish to communicate over such a network can do so.
So it's a long way to go for that, but we understand
the processes around it. We understand the sorts of architectures that
may evolve to cost effectively address that area, which is
all good news. And the technology and
the progress being made by various startups and various academic bodies are moving
along at a rapid pace, which kind of gives me hope
that we should be able to do this in a reasonable amount of time.
Reasonable here, being an ex-wireless guy, I tend to think in
ten-year generational shifts. So in ten years or so, I think we should see staggering
progress from where we are today. So, wrapping things up.
So what's the possible end state and problems if
we solve them, or problems yet to be solved? If we carry on down this
path, what would this idealized future look like? So this diagram,
this stick diagram here that I drew, kind of indicates where I think this is
going. And some of the problems that I've mentioned in
the qubit network are not necessarily the problems I'm mentioning here.
So an idealized future is something like this, that we
have a person who is interested in
writing a program for a very complex problem.
So he writes this program, and this program does not necessarily
have any knowledge of quantum networking in it whatsoever.
The program, or the application, is then presented to a virtualized quantum
data center. And the first
thing that this virtualized quantum data center does is it breaks down
that program, that application: it compiles
it and manages it such that it breaks it down into the components that
are required by the classical computing
to work on that particular program, and also breaks down
the applicable parts of that particular program that
require the attention of a quantum computer.
So that compilation is one problem area that needs to be solved.
How can I universally interpret programs
for classical and quantum computation, the mix of them? If we're
able to break it down, then the next natural thing is I need to be
able to do the networking between quantum computers inside
the same data center. Well, I wouldn't say it's
very different. It is networking, but it's probably
a different kind of networking, as we can more than likely use direct transmission between these
machines. But the point is, as we try to extend
distance, there's more than likely a
knee point in the behavior that requires us to change from one kind of
network into a different kind of network, and which brings in the second problem that
we have, and that is to find a cost
effective way of extending the rate and range at which quantum
communication can take place. If I'm
able to extend the range at which
I can get quantum computers to talk to each other, that means I
can then extend the number of quantum computers that talk to each other, and then
I get a more powerful quantum computer, and so on and so forth,
which is the aim of that one. The final
one is, and I think this is the one that's going to, well, actually,
I think a lot of these things are going to employ my colleagues
for a very long time. But the
final problem that's left is how
can quantum data centers be virtualized in
terms of resource and the management of those resources? And so
I think the first problem, the universal interpretation, and then the virtualization
of quantum data centers, both of these, I think, are going to occupy
my colleagues that are doing quantum computer science for many years to
come. Whereas the networking pieces of it, whether the
internal networking of a quantum data center
or the external extension of a quantum
data center, will be occupying me for quite some time.
So I've gone through quantum computing and its application
to quantum communication. I've taken you through
a whole bunch of stuff. I've taken you through a journey through the impact of
quantum computers on the current cryptography that
we use in networks today and the potential threat that there is. I've taken you
through how those same principles can be applied for quantum communication and
the benefits that could bring. I've indicated to you some of the problems that
we'll face on that particular journey, and also indicated to you
how an idealized future may appear for us and
what the kinds of problems are if we aim
towards that idealized future. Anyway, I hope you've enjoyed the talk,
I hope you found it informative, and I look forward to hearing your questions.
Thanks very much.