JPMorgan Chase Bets Big on Quantum Computing

Most talk about quantum computing today, at least in HPC circles, focuses on advancing the technology and the hurdles that remain. There are plenty of the latter. Financial services giant JPMorgan Chase (JPMC) takes a different, distinctly user-oriented perspective, generally steering clear of the qubit technology battles and instead focusing on becoming quantum-ready now. Quantum information science, believes JPMC, isn’t a nice-to-learn area but a must-learn one. QIS will upend many existing practices and introduce new ones.

No doubt having resources of its scale ($129 billion in 2021 revenue) helps fund JPMC’s wide-ranging technology research. In the quantum area, JPMC has been busily developing quantum algorithms around optimization, machine learning, and natural language processing, and publishing the results. Leading this effort is Marco Pistoia, a former distinguished IBM researcher who joined JPMC in 2020 as managing director of JPMC’s Global Technology Applied Research Center (brief bio at end of article).

Pistoia presented at Tabor Communications’ annual HPC + AI Wall Street conference held last month. While his comments focused on financial services, they were also representative of the perspectives and actions of potential QIS users now. These companies don’t care what the underlying quantum computing system is. They will use whatever systems become available and are tightly focused on learning how to wring competitive advantage from them. JPMC has worked with different qubit modalities, trapped ions among them.

Marco Pistoia, JPMorgan Chase

“At JPMorgan Chase, we don’t do research in isolation. We want to get the problems, the use cases, from the company. But at the same time, we publish our results because we want to contribute and this is very important. [Quantum computing] is not yet at the stage in which it can be used in production. In fact, it cannot be used in production today,” said Pistoia. “The quantum computers are not yet powerful enough. We are at the scientific stage and when we are in a scientific stage with a certain technology, that’s the best moment to actually collaborate with other companies and publish our results.”

JPMC has been doing just that, and a list of some of its recent papers is included at the end of the article along with a link to the video of the talk. As Pistoia emphasized in his presentation, “For people who work at the level of algorithms and applications, the physics of the quantum computer is not so important. We don’t need to understand how qubits work physically, we just need to know the mathematics of the qubits. There are a lot of opportunities for teams to grow with skills like math and computer science; physics, of course, is always welcome. We have actually a very diverse team, in terms of skills and background.”

Presented here is a brief summary of Pistoia’s talk along with some of his slides.

Why Bother with Quantum?

Without digging into the mysteries of superposition and entanglement, Pistoia noted simply that quantum promises dramatic speed-ups and accuracy improvements in optimization, simulation, and machine learning. Interestingly, despite expectations that science areas such as physics and chemistry would be the first to benefit from quantum computing, it turns out financial services will perhaps be the first to take advantage of it, said Pistoia.

“Why is that the case? The reason is that in finance, we have many use cases [that] have exponential complexity. [The] level of complexity explodes as soon as a dataset becomes big enough and a classical computer cannot solve that problem anymore,” said Pistoia.

The current approach to solving these types of problems in all industries is using approximation techniques. “With approximations, we don’t have the exact answer. Also, in finance, time is of the essence. Unlike other industry sectors where you can afford a little bit more time; [for example] the pharmaceutical industry can afford to run a computer for three days and then get the recipe for a new drug or a new vaccine. [In that case], I think three days is perfectly reasonable. In finance, we need to get answers right away. Because the market is quickly changing. A computation that takes three days is totally useless,” he emphasized.

Pistoia cut through the haze with a simple, crystal-clear example.

“Suppose you have 10 people to sit around a dinner table. It’s surprising to think that you have 3,628,800 possible ways to seat just ten people around a table. Think about how many combinations you can have. Of course, there are also other things to consider like constraints. For example, you want to maximize the chances that people who like each other are sitting next to each other, and people who don’t like each other [will be] sitting far apart one from the other. This problem is very difficult and that’s with just 10 people. Imagine when we do portfolio optimization and we have not 10 assets in a portfolio but thousands of assets. This number [of possible combinations] is gigantic.”
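The arithmetic behind Pistoia’s example is just a factorial, and the constrained version is a brute-force search over permutations. A minimal sketch (the “likes” relation below is made up purely for illustration) shows how quickly exhaustive search blows up:

```python
from itertools import permutations
from math import factorial

# 10 guests can be arranged in 10! ways around the table.
print(factorial(10))  # 3,628,800 arrangements

# Toy brute-force seating: score an arrangement by how many adjacent
# pairs like each other (these "likes" sets are invented for the example).
likes = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}, 4: set(), 5: {4}}

def score(seating):
    n = len(seating)
    # Count adjacent pairs (circular table) where either guest likes the other.
    return sum(
        1
        for i in range(n)
        if seating[(i + 1) % n] in likes[seating[i]]
        or seating[i] in likes[seating[(i + 1) % n]]
    )

# Exhaustive search over all 6! = 720 seatings of just six guests;
# with thousands of portfolio assets instead of guests, enumerating
# every combination is hopeless on a classical computer.
best = max(permutations(range(6)), key=score)
print(best, score(best))
```

Swapping six guests for thousands of assets is exactly the jump from a parlor puzzle to the exponential portfolio-optimization problems Pistoia describes.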

Sufficiently reliable quantum computers – leveraging superposition and entanglement – could solve these problems much faster and more accurately than classical systems. While the current crop of quantum computers varies widely in technology used, size (qubit count), and performance, all of them are generally lumped into the NISQ (noisy, intermediate-scale quantum) computer category. NISQ systems have many drawbacks (low qubit counts, high error rates, speed, etc.) and have so far proven unready for more than proof-of-concept use. Expectations are that higher qubit counts and improved error mitigation/correction will lead to better NISQ machines able to tackle a few narrow applications. Universal, fault-tolerant computers are much further off, perhaps a decade away.

It’s the Algorithms!

Even with sufficiently reliable quantum computers, solving these problems isn’t easy. Potential users, such as JPMC, are digging into the use cases and the new and modified algorithms that will be needed for quantum computers. Pistoia reviewed recent work by JPMC on several algorithms. Here are two examples that are also depicted on the slide below (click on slide to enlarge it):

  • Optimization. “We created an algorithm called NISQ-HHL. This was very interesting because we were able to take an algorithm that existed previously, called HHL, and augment it with additional capabilities. HHL, technically, is an algorithm for solving systems of linear equations, but it can actually be used for optimization as well, because a portfolio optimization problem, for example, can be cast as a system of linear equations. We took the HHL algorithm [and added] features and were able to execute it on one of the quantum computers that we have access to today. Before, everybody was super excited about the HHL algorithm but nobody had been able to actually execute it on a real quantum computer effectively because of its complexity. Everybody was waiting for a bigger quantum computer. We were able to execute it on one of the quantum computers that we have today and solve a very small portfolio optimization problem.”
  • Risk. “Another example is in the area of risk analysis and option pricing. We originally created an algorithm for option pricing that doesn’t have this dramatic speed-up; it’s not going from exponential to polynomial like we have been able to do for portfolio optimization. But we were able to find an algorithm that reduces the complexity. How does it do it? It starts from exponential complexity and after applying quantum computing it is still exponential, but with a much smaller exponent. The exponent is actually half the original exponent. So, for example, if the complexity was 2^100, after applying quantum computing and using this algorithm, the complexity will be 2^50. It’s actually a great result as well.”
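Halving the exponent is the signature of a quadratic quantum speedup, the kind delivered by amplitude-estimation approaches to Monte Carlo pricing. A back-of-the-envelope sketch (illustrative arithmetic only, not JPMC’s actual analysis; real algorithms carry different constants and overheads):

```python
# Quadratic speedup: a classical cost of 2**n drops to roughly 2**(n/2).
classical = 2 ** 100
quantum = 2 ** 50
assert quantum ** 2 == classical  # halved exponent = square root of the cost

# The same scaling shows up in Monte Carlo option pricing: classical error
# shrinks like 1/sqrt(N) with N samples, while quantum amplitude estimation
# achieves roughly 1/N, so hitting a target error eps needs about 1/eps**2
# classical samples versus about 1/eps quantum queries.
eps = 1e-4
classical_samples = round(1 / eps ** 2)
quantum_queries = round(1 / eps)
print(classical_samples, quantum_queries)
```

Even a “mere” quadratic speedup matters in finance, where a pricing run that misses the market window is, in Pistoia’s words, totally useless.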

These examples are quantum algorithms run on quantum computers. A different problem occurs at the application layer when input from the application must be converted into appropriate input for the quantum circuit.

Pistoia said, “Let’s go back to NISQ-HHL. It is an algorithm that solves systems of linear equations. So we need some logic that takes a portfolio’s input and transforms it into a system of linear equations so that NISQ-HHL can solve [it]. For risk analysis and option pricing, we needed a logic as well that takes the inputs for a risk analysis problem or a derivative pricing problem and makes the inputs suitable for the algorithm. [We realized] this logic sometimes can itself be the bottleneck – we were happy to have the algorithm that reduced the complexity by cutting the exponent in half, but realized the logic on top of the algorithm was becoming the bottleneck, negating the quantum advantage of the algorithm. So we had to create another algorithm for the input.”
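As a classical illustration of the transformation Pistoia describes, a minimum-variance portfolio with a budget constraint can be cast as a small linear system (its KKT optimality conditions) and handed to a linear-system solver, which is the role NISQ-HHL plays on quantum hardware. The covariance numbers below are made up, and the hand-rolled Gaussian elimination is just a dependency-free stand-in for any solver:

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting on a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Minimize w' S w  subject to  w1 + w2 = 1 (fully invested).
# The KKT conditions are linear:  2*S*w - lam*1 = 0  and  1'w = 1.
S = [[0.04, 0.01],   # toy 2-asset covariance matrix (made-up numbers)
     [0.01, 0.09]]
A = [[2 * S[0][0], 2 * S[0][1], -1.0],
     [2 * S[1][0], 2 * S[1][1], -1.0],
     [1.0,         1.0,          0.0]]
b = [0.0, 0.0, 1.0]

w1, w2, lam = solve_linear(A, b)
print(round(w1, 4), round(w2, 4))  # the two weights sum to 1
```

With thousands of assets the matrix grows accordingly, and, as Pistoia notes, the encoding step that builds this system from raw portfolio data can itself become the bottleneck if it isn’t as fast as the solver it feeds.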

Returning to the emphasis on use cases, Pistoia said, “The important thing that I wanted to emphasize today is these are the real problems of the bank. The algorithms that we design and implement today are the algorithms that we will use in the future when quantum computers become capable of running in production. The only thing that we are not doing is we’re not able to digest the real datasets that the bank is using every day. So [in that sense] we’re not really solving a portfolio optimization problem that the bank is facing today, because the quantum computers are not yet big enough.”

“But that’s okay. We know that. We just need to wait for the hardware to progress. Meanwhile, we’re not idle, waiting for this hardware to make progress. That’s a very important point. I think it’s important for the financial industry as a whole to realize that if a company doesn’t do anything about quantum right now, just waiting for quantum advantage to become a reality, when quantum advantage becomes real, it might be too late to catch up. Other companies will already be there,” he said.

Dealing with the Post-Quantum Security Headache

No financial services presentation would be complete without a discussion of the threat and opportunity quantum computing presents to data security. The threat, of course, is that when fault-tolerant universal quantum computers arrive, they are expected to be able to decrypt data that has been encrypted using current methods, thanks to Shor’s algorithm. (See HPCwire coverage, The Race to Ensure Post Quantum Data Security.)

NIST has cautioned against so-called ‘harvest now, decrypt later’ attacks, in which bad actors capture data now and store it until later – perhaps years from now – when it can be decrypted by quantum computers. Financial services companies, including JPMC, are all scrambling to be ready to cope, and NIST issued its first new post-quantum algorithms this summer.

“People were not super worried about it, because at the beginning, initial estimates showed that it would take a billion qubits to break a public key in one day. We’re very far away from having a billion qubits. So it sounded like we could relax a little bit. However, later, it was actually shown that you can do the same thing, you can compute somebody’s private key from the corresponding public key, with only 20 million qubits – a big difference from 1 billion to 20 million – and that you would be able to do that in eight hours. This year, researchers showed that it will take only 13,436 qubits to break a cryptography system with a public and private key in 177 days, which basically is less than six months. So, you know, six months is nothing, because our public and private keys definitely have a life that is much longer than that,” said Pistoia.

“Now, how far are we from these 13,436 qubits, which will be able to break cryptography in less than six months? We’re not that far away. Because if we look at the roadmaps of some companies that are leading in quantum hardware, we see that a number of qubits of that caliber that also incorporate error correction, which is another important thing that quantum computers must have, may actually be available in 2026. So I’m not saying that in 2026, our cryptography will be broken. But it might be. Maybe it will be later. But I think it doesn’t hurt to be cautious and think about the fact that this is going to happen at a certain point.”

Pistoia noted preparing for post-quantum cryptography is a huge effort. Indeed, many observers have noted how encryption/decryption functions have been scattered throughout IT infrastructure and that just identifying where the code exists in legacy systems is a challenge. There’s a growing call for a complete overhaul that is more modular and allows companies to readily find and swap in new encryption algorithms as they become needed.

Perhaps not surprisingly, Pistoia concluded with a call for action. He argued that waiting for quantum advantage to arrive isn’t a good approach.

“I wanted to conclude with this call for action. I think there are three things that every company should do. I mean, this is something that I have seen firsthand at JPMorgan. The first thing I mentioned before: becoming quantum ready is definitely crucial. Because we are in a very privileged moment. We know that quantum computing is arriving, but it’s not there yet. Perfect. This is the time for us to learn about it, and be ready for when quantum advantage becomes a reality.

“Now, when quantum advantage comes, it will not come at the same time for every application. Remember that I said before that for portfolio optimization we were able to achieve a dramatic speed-up I called exponential, and then for option pricing we were only able to cut the exponent in half. I think that gives an idea about the fact that some applications are going to benefit from quantum computing before others. It makes sense to identify these low-hanging fruit, this first wave of applications, and start from those.

“And another thing that I saw is that it’s crucial to build a team because quantum computing is a very specialized technology [and] it’s crucial to have a team that understands quantum computing. It’s also important that this team is not totally centralized in the sense that it cannot work in isolation but must work with the rest of the company.”

Link to video of Pistoia’s talk.

Recent Papers
Constrained Quantum Optimization for Extractive Summarization on a Trapped-ion Quantum Computer
NISQ-HHL: Portfolio Optimization for Near-Term Quantum Hardware
The Efficient Preparation of Normal Distributions in Quantum Registers
Option Pricing using Quantum Computers
Universal Quantum Speedup for Branch-and-Bound, Branch-and-Cut, and Tree-Search Algorithms
Importance of Kernel Bandwidth in Quantum Machine Learning
Bandwidth Enables Generalization in Quantum Kernel Models

Brief Bio of Pistoia
Marco Pistoia, Ph.D. is Managing Director, Distinguished Engineer, and Head of JPMorgan Chase’s Global Technology Applied Research (formerly Future Lab for Applied Research and Engineering), where he leads research in Quantum Computing, Quantum Communication, Cloud Networking, Augmented and Virtual Reality (AR/VR), Internet of Things (IoT) and Blockchain and Cryptography. He joined JPMorgan Chase in January 2020. Formerly, he was a Senior Manager, Distinguished Research Staff Member and Master Inventor at the IBM Thomas J. Watson Research Center in New York, where he managed an international team of researchers responsible for Quantum Computing Algorithms and Applications. He is the inventor of over 250 patents, granted by the U.S. Patent and Trademark Office, and over 300 patent-pending applications. Over 40 of his patents are in the area of Quantum Computing.

Dr. Pistoia received his Ph.D. in Mathematics from New York University in May 2005. He is the lead author of two printed books: Enterprise Java Security (published by Addison-Wesley in English and by Tsinghua University Press in Chinese) and Java 2 Network Security (published by Prentice Hall), both used as textbooks in many universities worldwide. He is also a coauthor of the online textbook Learn Quantum Computation using Qiskit, published in 2020.