Q&A with Photonic’s Stephanie Simmons and Paul Terry
December 6, 2023
Photonic Inc. is aiming to build one of the world’s first scalable, fault-tolerant and unified quantum computing platforms using silicon-integrated photonics and quantum optics. The company uses quantum low-density parity check (LDPC) error correction codes, similar to those used in 5G mobile communications, to enable its networked architecture.
Photonic recently partnered with Microsoft to make its technology available on Microsoft's Azure platform and raised an investment round of $100 million from several investors including Microsoft.
In this Q&A, Photonic founder and chief quantum officer Stephanie Simmons and CEO Paul Terry put forth their vision for networked, fault-tolerant scalable quantum computing driven by business use cases.
Enter Quantum: What does Photonic do?
Stephanie Simmons: We're slightly different from many of the other attempts to commercialize quantum physics in the sense that we're working backward from what a large-scale, reliable, fault-tolerant quantum computer has to look like.
First and foremost, we're designing as we would for classical supercomputers, where it's not just one monolithic supercomputer: you have lots of them, and they're all connected to the internet. We're applying those lessons to the very basic building blocks for quantum. Instead of trying to squeeze ever more qubits onto a chip, we're building a platform that can natively connect to other quantum processors.
What is the aim of your collaboration with Microsoft?
Simmons: The reason we reached out to them was that we think about quantum very similarly. For many years they have emphasized the importance of end-user value – use cases where you could specifically point to product-market fit and not just guess.
One of the things they consider to be the most exciting near-term use case for reliable quantum technologies is materials chemistry, which operates at the physical and chemical level, so quantum is a tool that's fit for purpose.
We are co-designing for that success case, thinking very carefully about what users need to unlock new capabilities and then designing for that, rather than trying to build something experimental and put it out into the world.
One of the steps toward practical quantum is fault tolerance. You recently announced a new low-density parity check (LDPC) error correction code. How does that contribute to error-corrected quantum?
Paul Terry: Think of the architecture as having a qubit on the end of a fiber going into a grid switch. That has two advantages: one is that we can break out of the physical constraints of a cryostat, and the second is that you can create qubit topologies that are no longer flat.
Most technologies have flat qubits on a plane that are connected, whereas our architecture has all these qubits on the endpoint of a network and you can connect any qubit to any other qubit.
This topology gives us access to a class of codes called LDPC codes, which let us do things you could never do if you were physically constrained to a flat surface.
LDPC codes aren't new; they are the backbone of 5G, so all of your phones use LDPC codes to get rid of errors in wireless communications. We're using the same technologies, but now in quantum communications, and that could bring fault tolerance forward by five years.
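The classical version of the idea Terry describes can be sketched in a few lines. The following is a minimal toy illustration, not Photonic's actual quantum LDPC construction: a sparse parity-check matrix H defines the code, and errors are detected by computing the syndrome H·r mod 2 of a received word r. The matrix below is a hypothetical example chosen only to show the mechanics.

```python
import numpy as np

# Toy sparse parity-check matrix for a 6-bit code (hypothetical example;
# real LDPC codes are much larger, with very sparse rows and columns).
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
])

def syndrome(received):
    """Parity-check syndrome of a received word; all zeros means no detected error."""
    return H @ received % 2

codeword = np.zeros(6, dtype=int)       # the all-zeros word satisfies every check
assert not syndrome(codeword).any()     # clean transmission: zero syndrome

flipped = codeword.copy()
flipped[1] ^= 1                         # simulate a single bit flip on the channel
print(syndrome(flipped))                # → [1 1 0]: checks 0 and 1 fire, flagging the error
```

A decoder would use the pattern of failing checks to infer which bit flipped. Quantum LDPC codes apply the same sparse-check principle to qubits, which is where the any-to-any connectivity of a networked architecture matters: the checks need not be restricted to nearest neighbors on a plane.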
How does your architecture support scaling?
Terry: Supercomputers changed when we stopped making ever bigger chips and instead made computers that are connected – modern supercomputers are networks of chips.
You can make a qubit out of all sorts of things, and you need to pick one that you can scale and manufacture; then fault tolerance is the goal. Humanity is very good at making things in silicon. We put billions of transistors on a chip, and the T center – which is metaphorically a transistor that sits in a silicon chip – natively connects via photons to any other T center through a standard fiber.
Now we're moving away from the idea that we've got 60, 70, 100 qubits – the number of qubits is secondary to the fact that ultimately, we can distribute qubits, connect them, create topologies and make them fault tolerant. The Microsoft relationship is about running quantum services, which is what you need to offer value to an end user.
What’s on Photonic’s quantum road map for 2024?
Simmons: It’s not just about the race to deliver scalable, reliable quantum, although that's a massive global race right now. The other thing worth emphasizing is the opportunity to work within users' code to develop specific use cases.
That's one of the core reasons why we're so excited to announce the code development collaboration with Microsoft to unlock as much of that as we can in advance of these technologies coming online.