A new kind of quantum

So far, quantum computing has been ruled by ions and electrons, but does its future lie with photons?
01 November 2020
By Rebecca Pool

When a little-known photonic quantum computing startup, PsiQuantum, raised $150 million in its latest funding round this year, the technology community took notice. With Microsoft's venture capital arm and other prominent investors having now ploughed a mighty $215 million into the four-year-old Palo Alto company, it became one of the best-funded startups in quantum computing history.

However, PsiQuantum turned even more heads when, after years of silence from the company, chief executive Jeremy O'Brien told Bloomberg Businessweek that his company would build a quantum computer with one million qubits—the minimum number deemed necessary for a commercial system—within "a handful of years." For context, IBM and Google only recently introduced 65- and 54-qubit systems, respectively.

So what makes PsiQuantum's technology so promising? As Terry Rudolph, co-founder and chief architect says, "Interestingly, our photons do not decohere." And when Rudolph says something is interesting, he knows what he's talking about: he's a professor of quantum physics at Imperial College London, and, if you really want to dig into his credentials, also Erwin Schrödinger's grandson.

For years, quantum computing has mostly relied on two technologies—superconducting circuits and trapped ions—to create qubits, the all-important units of quantum information.

The superconducting qubit is under development by industry heavyweights including IBM, Google, and Canada-based D-Wave. Typically built around a Josephson junction, this qubit consists of a pair of superconducting electrodes separated by a nanometer-thin insulating barrier through which electron pairs tunnel, giving rise to quantum effects. Qubits can be coupled together and manipulated with microwave pulses to generate superposition and entanglement, the fragile quantum states that enable computation.
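For readers who want to see those two effects concretely, the short sketch below uses IBM's open-source Qiskit library to simulate a pair of qubits placed in a Bell state, an equal superposition of |00> and |11> shared between them. It is a generic simulation of superposition and entanglement, not a model of IBM's microwave-pulse hardware controls.

from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Prepare a Bell state: a superposition of |00> and |11> shared by two qubits.
bell = QuantumCircuit(2)
bell.h(0)        # Hadamard puts qubit 0 into an equal superposition
bell.cx(0, 1)    # a controlled-NOT entangles qubit 1 with qubit 0

state = Statevector.from_instruction(bell)
print(state.probabilities_dict())   # {'00': 0.5, '11': 0.5}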

Meanwhile, Honeywell, IonQ, and other companies are pursuing the trapped-ion approach, which uses ionized atoms to carry quantum information. The ions are housed within specialized chips, or ion traps, that feature lithographically written electrodes to generate electromagnetic fields. These fields hold each ion precisely in place, ready for manipulation with microwave signals or lasers to create quantum states.

But, for each setup to function, it must be isolated from environmental noise to reduce errors and delay decoherence, in which the qubit quantum state collapses and information is lost. Given this, superconducting quantum computers are housed in dilution refrigerators that cool the qubits to near absolute zero, while trapped ion setups are laser-cooled and placed in ultra-high-vacuum chambers.

IBM hardware scientist Nick Bronn explains that, in addition to isolation, "filter" qubits can be used to reduce noise, while researchers also employ software, including error-correction schemes, to further delay decoherence. "Our superconducting qubits are very, very fast, but do have slightly shorter coherence times as opposed to some other architectures," says Bronn.
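The principle behind such error-correction schemes can be shown with the textbook three-qubit bit-flip code, sketched here with Qiskit's statevector tools. This is a classroom illustration rather than IBM's production scheme: a deliberate flip on one qubit shows up in two parity checks without disturbing the encoded superposition.

from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, Pauli

# Encode one logical qubit across three physical qubits (bit-flip repetition code).
enc = QuantumCircuit(3)
enc.h(0)          # put the logical qubit into a superposition
enc.cx(0, 1)      # copy its basis value onto qubit 1...
enc.cx(0, 2)      # ...and onto qubit 2

noisy = enc.copy()
noisy.x(1)        # deliberately flip the middle qubit to mimic an error

# Parity checks Z0Z1 and Z1Z2 reveal which qubit flipped without reading out
# the encoded information (Qiskit orders Pauli strings little-endian).
state = Statevector.from_instruction(noisy)
s1 = state.expectation_value(Pauli("IZZ")).real   # compares qubits 0 and 1
s2 = state.expectation_value(Pauli("ZZI")).real   # compares qubits 1 and 2
print(f"syndrome: ({s1:+.0f}, {s2:+.0f})")        # (-1, -1): the shared qubit 1 flipped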

Jungsang Kim, professor of electrical and computer engineering at Duke University, and co-founder and chief technology officer of trapped ion quantum computing startup IonQ, echoes Bronn's sentiments. "Isolation in both trapped ion and superconducting qubit systems is not perfect, so you get cross-talk between qubits, which can affect computation," he says. "However, we use an array of high-power lasers to manipulate the gates that control the qubits' state, and are working hard to ensure these gates become more perfect," he adds.

IBM hardware scientist Nick Bronn working with a superconducting quantum computer. Credit: IBM

The hard work is paying off. Two decades ago, IBM's first superconducting qubit retained coherence for roughly a nanosecond; today it reaches a few hundred microseconds. Meanwhile, IonQ's trapped-ion qubit coherence time is typically a few seconds, and researchers have extended this to minutes in a recent experiment.

However, while these short bursts of quantum lucidity allow initial calculations, the one-million-qubit "fault-tolerant" computer—one that can catch and correct errors faster than they accumulate—still feels a long way off. Unless, perhaps, your quantum currency is photons.

According to PsiQuantum's Rudolph, the photonic qubit is pretty stable. "The only noise in our photonic approach arises from imperfect components, and this can be well characterized," he says. "Also, our photons don't interact, so we don't get crosstalk or other difficult-to-understand error propagation, which means the basic methods of tackling noise using error-correcting codes work."

The company has focused on building a fault-tolerant architecture via error correction. "It is not an added-on feature," he says, "but an intrinsic part of our design."

But what will PsiQuantum's quantum computer look like? Despite growing from a handful of researchers to a technical team of more than 100 members in its short lifetime, little is known about the company and its technology. Indeed, industry observers have described PsiQuantum as "quiet" and "stealthy." As Nicolas Menicucci, associate professor at the Centre for Quantum Computation and Communication Technology, RMIT University, Australia—who is also developing photon-based quantum computing—puts it, "I'm being serious when I say PsiQuantum is mysterious—the company keeps a lot of what it's doing under wraps."

Still, the company has publicly stated that its systems are based on silicon photonic chips manufactured by US semiconductor foundry GlobalFoundries, offering a relatively straightforward path to large-scale manufacturing.

"To economically and reliably produce large volumes of components, you need to be in a Tier 1 semiconductor fab," asserts Rudolph. "These suppliers produce billions of transistors for laptops and cell phones—so millions of qubits is easily achievable for them."

The qubits are encoded using single photons that travel along silicon photonic waveguides and are then entangled using networks of optical components. While Rudolph will not be drawn on qubit numbers, he is willing to say, "We can build a large distributed entangled state of many photonic qubits, with a very specific structure. By measuring these qubits using single-photon detectors, we can implement any gate sequence and then run a quantum algorithm."
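That measurement-driven style of computing can be illustrated with the smallest possible cluster state: two entangled qubits. The generic Qiskit simulation below (not a model of PsiQuantum's photonic hardware) measures the first qubit and, after an outcome-dependent correction, leaves the second qubit carrying a Hadamard-transformed copy of the input; chaining such steps is how measurements alone can drive a gate sequence.

import numpy as np
from qiskit import QuantumCircuit
from qiskit.circuit.library import XGate
from qiskit.quantum_info import Statevector, partial_trace

theta = 0.7                        # arbitrary input state, purely illustrative
circ = QuantumCircuit(2)
circ.ry(theta, 0)                  # input qubit |psi> on qubit 0
circ.h(1)                          # ancilla qubit prepared in |+>
circ.cz(0, 1)                      # entangle the pair: a two-qubit cluster state
circ.h(0)                          # so the final Z measurement acts as an X-basis measurement

outcome, post = Statevector.from_instruction(circ).measure([0])
if outcome == "1":                 # outcome-dependent Pauli correction
    post = post.evolve(XGate(), qargs=[1])

# Qubit 1 now carries H|psi>: a gate was applied purely by measurement.
ref = QuantumCircuit(1)
ref.ry(theta, 0)
ref.h(0)
expected = Statevector.from_instruction(ref)
print(np.allclose(partial_trace(post, [0]).data,
                  np.outer(expected.data, expected.data.conj())))   # prints True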

More than one route to photonic quantum computing

But there is more than one way to light up quantum computing. Instead of entangling individual photons to perform calculations, entire light beams can be entangled to create a "continuous-variable cluster state" over which computations can take place.

RMIT's Menicucci is working with light in this way. As he points out, using continuous variables for quantum computing—instead of qubits made of single photons—may seem unorthodox, but has its perks.

Like a single-photon-based setup, continuous-variable cluster states are optically prepared, and can be made at room temperature. However, according to Menicucci, these states are easier to produce at scale than the single-photon approach, with the potential to process a lot of information and to nudge the technology closer to fault-tolerant quantum computing. "We're starting with extreme scalability built in from the very beginning," he says.

Menicucci has devoted more than a decade to developing continuous-variable cluster states, and saw his designs become reality just last year. Working with Akira Furusawa and a team of experimentalists from the University of Tokyo, as well as fellow researchers from the University of New Mexico and the University of New South Wales, Canberra, Menicucci produced a large-scale cluster state with the entanglement structure necessary for quantum computing.

In this experiment, violet laser light was directed onto four optical parametric oscillators to produce infrared "squeezed" states in each of four beams. These squeezed states were then woven together by a network of beamsplitters and optical delay lines into the final entangled continuous-variable cluster state. According to Menicucci, the amount of squeezing was too low for practical application, although homodyne detection verified that the state had been entangled sufficiently for quantum computation.
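A drastically simplified version of that recipe can be written in Xanadu's open-source Strawberry Fields library: two squeezed modes are mixed on a 50:50 beamsplitter and one output is read by homodyne detection. The two-mode toy program below stands in for the four-oscillator, delay-line setup used in Tokyo, and the squeezing value and measurement angle are arbitrary choices for illustration.

import strawberryfields as sf
from strawberryfields import ops

prog = sf.Program(2)
with prog.context as q:
    ops.Sgate(1.0) | q[0]             # squeezed vacuum in mode 0 (illustrative value)
    ops.Sgate(1.0) | q[1]             # squeezed vacuum in mode 1
    ops.BSgate() | (q[0], q[1])       # 50:50 beamsplitter entangles the two beams
    ops.MeasureHomodyne(0.0) | q[0]   # homodyne detection of the x quadrature

eng = sf.Engine("gaussian")           # Gaussian-state simulator
result = eng.run(prog)
print("homodyne sample:", result.samples)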

Menicucci also believes that achieving quantum entanglement in cluster states is relatively straightforward compared to the single-photon qubit approach. While PsiQuantum's Rudolph asserts that any entanglement issues in photonic qubits can be overcome by using robust error correction, Menicucci reckons the challenge for PsiQuantum is getting the photons entangled in the first place.

"All optical approaches have advantages compared to other setups," he says. "But you can't deterministically create entanglement between single photons without a lot of extra machinery. You try to create entanglement between photons by sending them through an entangling gate. Sometimes the gate works, and sometimes it doesn't," he adds. "Using continuous variables instead of single photons avoids this problem. In our case, the entangling gate always works."

Nonetheless, research continues. According to Menicucci, "Even our entanglement is never perfect, and there's always some noisiness and decoherence in the computation."

PsiQuantum's silicon photonic chips on a wafer. Credit: PsiQuantum

Indeed, he and colleagues are tackling these issues by reducing optical losses, improving the level of squeezing, and designing new codes for quantum error correction. And excitingly, Canada-based startup Xanadu is following a similar path to photonic quantum computing.

Xanadu's chief executive Christian Weedbrook, who has collaborated with Menicucci in the past, agrees that optical losses and squeezing levels need some work. But as he highlights, "We are working with foundries to produce better-quality, lower-loss chips, and also use techniques such as error mitigation and suppression to combat loss further."

"Squeezing light is a challenge, but there are no fundamental issues to achieving higher levels. We've been increasing [squeezing] levels over the last couple of years," he adds.

Xanadu, founded four years ago and having already received around $45 million in investment, expects to begin raising more funds early next year to scale its activities. In a similar vein to PsiQuantum, Xanadu develops photonic quantum chips based on nanophotonic silicon nitride waveguides, fabricated using standard semiconductor lithography methods. And according to Weedbrook, working with photonics also means Xanadu is well on the way to room-temperature quantum computation. All of the company's current hardware operates at room temperature, except for the photon-counting detectors that read out the quantum states of the entangled light.

"We'll eventually have the entire computer operating at room temperature, and can do this in a variety of ways including [experimenting] with different detectors...which we are working towards over the coming years," he says.

The company has already introduced its X8 chip platform, containing eight squeezed quantum states, and intends to release X12 and X24 by the end of the year. All will be accessible over the Xanadu cloud.
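Programs for the X-series chips are written in the same Strawberry Fields language. The sketch below only imitates the structure Xanadu has described publicly, pairs of two-mode squeezers feeding photon-counting detectors, and runs on the local Gaussian simulator rather than the cloud hardware; the mode pairing and squeezing value are illustrative assumptions, not the chip's specification.

import strawberryfields as sf
from strawberryfields import ops

prog = sf.Program(8)                        # eight optical modes, loosely mirroring X8
with prog.context as q:
    for i in range(4):
        ops.S2gate(1.0) | (q[i], q[i + 4])  # two-mode squeezing across paired modes
    ops.MeasureFock() | q                   # photon-number detection on every mode

eng = sf.Engine("gaussian")                 # local simulator standing in for the chip
result = eng.run(prog)
print(result.samples)                       # one row of photon counts per mode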

"We've just launched the world's first photonic quantum cloud platform, and already have customers paying for cloud time," says Weedbrook. "One of the biggest thrills is having quantum physicists and engineers just wanting to play around with a real quantum chip and have fun."

Indeed, industry players including IBM, D-Wave, Netherlands-based QuTech, and Rigetti offer cloud services, while IonQ and other companies have joined forces with the likes of Microsoft and Amazon to make their technology available in the cloud. These services are typically being used to teach quantum mechanics, test simple algorithms, and create quantum games, but expect more soon.

IonQ's Kim says, "The quantum cloud is so important as it provides quantum computing access to a large number of people...this will stimulate people to think about how to use quantum computing in ways that conventional thinkers won't."

And Weedbrook agrees. "We're building an ecosystem of early adopters here that will really help to speed us towards quantum advantage and supremacy."

But this is where established quantum technologies have the upper hand, at least for now.

Is quantum supremacy the only goal?

In October last year, more than 75 researchers led by Google published a Nature paper detailing how they had achieved quantum supremacy using their 53-qubit superconducting processor "Sycamore." Quantum supremacy entails solving a problem that cannot realistically be solved by a traditional computer. And in this case, Sycamore had completed a random-circuit sampling task in 200 seconds that the researchers claimed would take IBM's Summit supercomputer 10,000 years.
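The flavor of that benchmark is easy to reproduce at toy scale: generate a random circuit and sample bitstrings from its output, a task whose classical simulation cost grows exponentially with qubit number and depth. The sketch below uses Qiskit's built-in random-circuit generator on five qubits, and is of course nothing like the 53-qubit experiment.

from qiskit.circuit.random import random_circuit
from qiskit.quantum_info import Statevector

# A small random circuit: 5 qubits, depth 10 (the Sycamore experiment used 53 qubits).
qc = random_circuit(5, depth=10, seed=42)

# Sampling the output distribution is easy here, but the cost of this classical
# simulation grows exponentially as qubits and depth are added.
counts = Statevector.from_instruction(qc).sample_counts(shots=1000)
print(counts)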

IBM swiftly countered these results by calculating that the problem would actually take Summit around two and a half days. And in the weeks that followed, many researchers also highlighted that the random-circuit sampling problem was of little practical interest and tailor-made to favor quantum hardware.

Still, as Weedbrook says, "We will look back and say this was a fundamental moment in quantum computing. Yes, Google did achieve quantum supremacy with an artificial mathematical problem that has no real business application. But this doesn't undermine how great an experiment this was."

For his part, Bronn says IBM is focusing on quantum advantage rather than quantum supremacy. "This involves building a quantum computer that can solve a real-world problem, as this is what will provide business results," he says.

What's more, he believes quantum advantage will be achieved with so-called noisy intermediate-scale quantum computers relatively soon. These systems, from IBM, D-Wave, and others, are seen as a halfway house on the road to aspirational fault-tolerant computing—they accumulate errors over time due to imperfect qubit control, but can generate accurate answers for shorter calculations.

Fabricating a silicon photonic chip, the building block of what PsiQuantum hopes will be the world's first useful silicon photonic quantum computer. Credit: PsiQuantum

"This is where many researchers are working, and we have several financial applications coming out of our research in Zurich," says Bronn. "We're hoping that within a decade we have an application in which a quantum computer is superior to a classical computer."

And of course, the quantum advantage/supremacy chase continues amongst all technologies. Kim reckons his flavor of trapped-ion quantum computing will reach this milestone in a few years, while Weedbrook is looking at a similar timeframe with Xanadu's photonic chip platforms.

But where does this leave PsiQuantum and its million-qubit ambition? The world can only wait. As Rudolph puts it, quantum computer developers have historically had a choice—build a large fault-tolerant computer to tackle real-world problems, or settle on a smaller noisy quantum computer, and "hope" to find something useful for it to do.

"The former choice is the harder choice that requires engineering a solution that can scale to more than one million qubits," he says. "Our architecture has focused on this fault tolerance approach from the start."

Rebecca Pool is a science and technology writer based in Lincoln, United Kingdom.
