IBM eyes ‘seamless integration’ of quantum, classical computing

Blending classical and quantum computing could reduce the cost of quantum calculations and eliminate the need to understand hardware specifics, IBM says.

The number of qubits in quantum computers is expanding. In November, IBM unveiled Eagle, a 127-qubit quantum processor, becoming the first quantum computing provider to break the 100-qubit barrier. The system containing the Eagle processor, called ibm_washington, is available on the cloud, says Bob Sutor, chief quantum exponent at IBM. IBM aims to have a 433-qubit processor in 2022 and a 1,121-qubit processor the following year.

IBM currently has 23 quantum computers in the cloud with a range of processing power: 1 qubit, 5 qubits, 7 qubits, 27 qubits, 65 qubits, and 127 qubits. The larger ones are more state-of-the-art, exploratory systems, says Sutor.

IBM also recently introduced Quantum Serverless, a new programming model that aims to provide “seamless integration” for quantum and classical computers and make the quantum advantage faster.

This approach—which could be used to speed up portfolio optimization, risk management, and analytics problems, for example—is also known as hybrid quantum–classical computing, says Dimitris Angelakis, principal investigator at the Centre for Quantum Technologies at the National University of Singapore. CQT is a Singapore research center focused on the foundations of quantum physics.

“It’s kind of a variational approach. The quantum computer spits out something that’s fed to the classical optimizer. Imagine you’re looking for the minimum of a landscape; that landscape could be the financial loss of a product, or its risk, and you’re trying to find the minimum. You get to the point where you go back, change your parameters and run it again, and you do that many times until you get your absolute minimum,” he says.
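
As a rough illustration of that loop, the sketch below stands in for real quantum hardware with a noisy classical cost function (an assumption made purely so the example runs anywhere) and lets scipy’s derivative-free COBYLA optimizer play the classical role:

```python
import numpy as np
from scipy.optimize import minimize

def evaluate_circuit(params):
    # Placeholder for a run on quantum hardware: in practice this would
    # prepare a parameterized circuit, execute it, and return an
    # estimated cost (e.g., expected portfolio risk). Here we fake it
    # with a noisy classical function so the loop is runnable.
    true_cost = np.sum((params - 0.5) ** 2)
    return true_cost + np.random.normal(0, 0.01)

# The classical optimizer drives the loop: propose parameters, estimate
# the cost on the "quantum" side, update, repeat until converged.
initial_params = np.zeros(4)
result = minimize(evaluate_circuit, initial_params, method="COBYLA")
print(result.x)  # parameters near the landscape's minimum
```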

Banks have already started experimenting, with some turning to quantum computing for certain problems. Goldman Sachs, for example, worked with IBM to estimate the quantum computing resources needed to achieve quantum advantage in derivatives pricing.

Barclays published a whitepaper with IBM detailing how quantum computing can be used to speed up securities settlement cycles.

And a group of Dutch banks is exploring quantum computing for regulatory stress tests.

Currently, if you want to use the ibm_washington system, for example, you would need to call a function within the code to get access to that particular machine. Once you have access, you would submit your circuit and ask the computer to execute the job on that system.
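
For illustration, with the Qiskit library of that era, the explicit, named-backend workflow looked roughly like this (a generic sketch, not IBM’s exact recommended code; only the backend name ibm_washington comes from the article):

```python
from qiskit import IBMQ, QuantumCircuit, execute

# Load saved credentials and explicitly name the machine you want.
provider = IBMQ.load_account()
backend = provider.get_backend("ibm_washington")

# A trivial two-qubit circuit as a stand-in for a real workload.
circuit = QuantumCircuit(2, 2)
circuit.h(0)
circuit.cx(0, 1)
circuit.measure([0, 1], [0, 1])

# Submit the job to that specific system and wait for the result.
job = execute(circuit, backend=backend, shots=1024)
print(job.result().get_counts())
```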

But what if you don’t know exactly how many qubits your problem or job needs? How do you know which quantum computer to call on in that case?

“Serverless” is a strange word, Sutor says. “It doesn’t mean that there are no servers. It’s almost like it’s server-nameless. You don’t know specifically the names, or maybe even the locations of the servers. But it’s basically, ‘Here are the characteristics, this is what I want to accomplish,’ and you’re precise about this. ‘Here’s where my data lives, you figure it out, you go do it. Spin up the containers, do what you’ve got to do. And then when it’s finished, bring it back to me. You take care of the memory and the disk space—whatever it is for the total job—and then just charge me whatever it costs to do that,’” he says.
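
The quote suggests a job description in place of a machine name. As a purely hypothetical illustration (none of these names come from IBM’s actual Quantum Serverless API), a scheduler might look something like this:

```python
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    qubits: int
    cost_per_second: float

@dataclass
class JobSpec:
    # What the user states: the job's characteristics, not a machine name.
    min_qubits: int
    data_uri: str

FLEET = [
    Backend("machine_a", 27, 1.0),
    Backend("machine_b", 65, 2.5),
    Backend("machine_c", 127, 6.0),
]

def schedule(spec: JobSpec) -> Backend:
    # The platform, not the user, picks the cheapest machine that fits.
    candidates = [b for b in FLEET if b.qubits >= spec.min_qubits]
    return min(candidates, key=lambda b: b.cost_per_second)

job = JobSpec(min_qubits=40, data_uri="s3://bucket/inputs")
print(schedule(job).name)  # -> machine_b
```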

Seamless integration

Quantum Serverless aims to provide users with quantum resources without the need for extensive knowledge of the hardware.

Giulio Chiribella, a professor in the department of computer science at the University of Hong Kong, says the idea of serverless is that when a user formulates a problem for their computer, they shouldn’t have to be an expert in quantum science or in the quantum hardware used.

He says that while he is an expert in quantum science, his level of expertise doesn’t extend to specific hardware details.

“Users want a system where they can tell a program what they want to do, and a software developer can write the code knowing the minimum amount of information about quantum computing, probably even zero. It would be like putting in high-level instructions and leaving it to the system to handle the details of how it is implemented in the hardware,” he says.

The physical details of the hardware should not matter to developers writing the code.  

“The idea is that the system optimizes behind the scenes which computer is used, so that you minimize the cost. And you would pay only for the time that the quantum computer actually spends running the algorithm: not for the whole time of the algorithm from input to output, but really for the machine time of the quantum computer, which is more expensive than the time of a classical computing facility,” says Chiribella.

A lot of the work is done classically on the side because quantum computers are not good at certain data-processing tasks. What they are good at, however, is “finding the minima or maxima of some functions, and finding symmetries, or finding the period of a function, which are the more basic mathematical sub-routines a quantum computer would do,” says Chiribella.
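
Period finding is a concrete example of such a subroutine. The snippet below computes the period of f(x) = a^x mod n by classical brute force; it is the step Shor’s algorithm performs exponentially faster on a quantum computer, shown classically here only to make the subroutine concrete:

```python
from math import gcd

def period(a: int, n: int) -> int:
    # Smallest r > 0 with a**r % n == 1 (requires gcd(a, n) == 1).
    # This is the subroutine a quantum computer finds efficiently;
    # it is brute-forced classically here for illustration.
    assert gcd(a, n) == 1
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

print(period(7, 15))  # -> 4, since 7**4 = 2401 = 160 * 15 + 1
```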

A hybrid approach could mean that parts of a workload are run on a classical server and others on a quantum computer. Users would provide IBM with the input and the specifications of the type of service required; that input could itself be the output of a workload already processed on classical cloud servers. If a quantum computer is needed, Quantum Serverless would decide which system is best for the job, and that system’s output could in turn feed another calculation on a classical server.
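
A sketch of that hand-off might look like the following; the three-stage split and all function names are illustrative assumptions, with the quantum stage stubbed out classically so the example runs anywhere:

```python
def classical_preprocess(raw):
    # Runs on ordinary cloud servers: clean and reduce the data so the
    # quantum stage needs as few qubits and gates as possible.
    return [x / max(raw) for x in raw]

def quantum_stage(features):
    # Stub for the step Quantum Serverless would route to whichever
    # quantum system fits the job; here a classical stand-in.
    return sum(f * f for f in features)

def classical_postprocess(score):
    # Back on classical servers: turn the quantum output into a decision.
    return "accept" if score < 1.5 else "review"

raw_data = [4.0, 2.0, 1.0]
print(classical_postprocess(quantum_stage(classical_preprocess(raw_data))))
```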

Chiribella says this is important because it can improve the performance of the overall system by orders of magnitude, depending on how the data is pre-computed, pre-processed, and put into a format that can be run on a quantum computer with the minimum amount of noise and the minimum use of resources: the number of qubits and the number of gates actually run on the machine.

IBM’s Sutor says this “seamless integration” is what IBM strives for. “It’s not classical; it’s not quantum computing. It’s computing. And we don’t run around saying, ‘CPU computing against GPU computing’—that’s just computing. So, it will take time. But you see how we’re taking what we’ve learned in the cloud or other situations and are now doing analogous things with quantum computers. And we’re at the early stages of that,” he says.

For example, Sutor says firms currently rely on traditional classical processors and memory, particularly for AI and machine-learning applications. But for certain specialized calculations, they might use GPUs.

Though GPUs are best known for rendering the graphics in video games, they are also used for AI. In this way, GPUs are integrated with the more classical, non-GPU parts of certain algorithms, Sutor says.

Depending on the need, one might call a GPU to perform a particular calculation, but a classical algorithm, written in Python, C++, or any other language, drives it.

“You want to think of the framework that is driving the workflow. Let’s say you’re trying to do risk assessments or settlements. It’s being driven by classical processing, but at certain points, you may decide to use a GPU. So now, we’re going to replace GPU with quantum computing. … The idea is that you would plug in a quantum solution, and you would call it at some point in the middle of a workflow,” he says.

Breaking barriers

IBM breaking the 100-qubit barrier is a major advancement in the quantum computing space, according to Chiribella and Angelakis.

However, Angelakis says he would like to see more details and demonstrations on the chip’s capabilities. “There are other numbers apart from the number of qubits,” he says. “What are the gate fidelities? How good are the qubits, and what can we do, like in a demonstration or some sort of benchmark?”

The first realistic applications of such quantum computers will most likely be in natural sciences, like drug discovery or materials design, but that hasn’t stopped financial services firms from experimenting.

The area of optimization is where Chiribella and Angelakis believe the financial services industry will see the first quantum applications.

Angelakis says his team at CQT has been working on improving optimization algorithms to deal with a smaller number of qubits.

Real-world portfolio optimization problems have thousands of facets, he says. The quantum computer required to solve that would need to have thousands or tens of thousands of qubits. “And that’s not what we have. We managed to compress the number of qubits required and possibly attack larger problems for financial optimization and energy management,” he says.

Optimization problems—maximizing revenue and minimizing risk—are what banks have always been doing with the best tools available. Quantum computers can improve some of these tasks.

“But one needs to not be blindly looking at quantum computing as the solution to all problems. It’s more like saying, ‘Within portfolio optimization, which particular problems seem to be more suitable for quantum computers?’ We are really in a transition phase for quantum computers where the progress is big from one year to the next. But we shouldn’t confuse what we will have with what we have now. It can be useful for some specific applications, but it is not as big as it will probably be in four or five years,” Chiribella says.

Quantum ML

While portfolio optimization is where the first financial-sector quantum applications are expected to appear, work is also being done in the areas of AI and machine learning.

For example, Quantinuum, the merged entity of Honeywell Quantum Solutions and Cambridge Quantum, is working on quantum natural language processing (QNLP). It recently released lambeq, an open-source QNLP toolkit and library that accelerates the development of QNLP applications such as automated dialogue, text mining, language translation, text-to-speech, language generation, and bioinformatics.

On IBM’s side, in June 2021, the European Organization for Nuclear Research (Cern) in Geneva became IBM’s newest quantum hub, which gives Cern dedicated access to IBM’s quantum computers. Cern is experimenting with quantum machine learning to better analyze raw data captured by particle detectors—devised to detect, track, and identify ionizing particles.

Chiribella says a similar application could be using quantum computing to detect patterns in financial data. “The problems are very different from the outside, but if you look at the mathematical structure, it’s pretty much the same. The computer doesn’t care if your patterns are about particles or about stock markets,” he says.

There are two types of quantum machine learning, says Angelakis. One is where classical data—like financial data—is paired with quantum algorithms to make some sort of prediction. “You can have some sort of past data to predict what’s going to happen in the future markets or use it for forecasting. Or you can try to segment customers or segment behaviors for predictions,” he says.
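
Before any quantum algorithm can touch classical features, they have to be loaded into quantum states. The snippet below simulates one common textbook technique, angle encoding, with plain numpy; it is an illustrative assumption rather than any vendor’s method, with each feature becoming the rotation angle of a single qubit:

```python
import numpy as np

def angle_encode(feature: float) -> np.ndarray:
    # Encode one classical feature as a single-qubit state
    # RY(f)|0> = [cos(f/2), sin(f/2)]: the amplitudes now carry
    # the data, ready for a quantum model to process.
    return np.array([np.cos(feature / 2), np.sin(feature / 2)])

# A toy "financial" feature vector, one qubit per feature.
features = [0.3, 1.2, 2.5]
for f in features:
    state = angle_encode(f)
    print(f, state, np.sum(state**2))  # each state stays normalized
```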

Then, there’s the other type, which is pairing quantum data—outputs from quantum computers—with quantum algorithms to “try to make sense of your experiments.” The problem there, however, is loading big data onto quantum machines.

“This is something where we’re not completely clear on how to make what’s called a quantum random access memory efficiently,” Angelakis says.

While quantum machine learning generates a lot of excitement, particularly in academia, it has yet to deliver useful results on a practical level, says Chiribella. But a quantum computer of more than 100 qubits, and even more in the coming years, could change the game and bring quantum machine learning applications to life, for example in pattern recognition.
 
