
Publication Note: This is an update to the original quantum computing survey article that was published in 2021. This analysis includes developments through early 2025. Some referenced developments may be forthcoming at time of reading. All performance metrics and capabilities described reflect the rapidly evolving nature of quantum computing technology.
Contents
- Abstract
- Introduction
- A Brief Primer on Quantum Computing Technology
- How Quantum Computing Could Change the World
- Industry Structure and Major Players
- Potential First-Generation Applications of Quantum Computing
- How Quantum Computing Could Change Finance
- What to Expect Next for Quantum Computing
- Conclusions and Recommendations
- Glossary of Key Terms
- References
Abstract
Quantum computers operate fundamentally differently from classical computers. Due to the quantum effects known as superposition and entanglement, quantum bits (qubits) can exist in non-binary states represented by complex numbers. This facilitates computational solutions to mathematical problems that classical computers cannot solve in practice, because doing so would require sequentially computing an astronomical number of combinations or permutations.
This ability of quantum computers means that they particularly excel at optimization problems, where the optimal combination is only found after trying out an enormous number of possible combinations. Several important problems in finance are in essence optimization problems which meet this description. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit-scoring, and Monte Carlo-type risk analysis are other examples. Quantum annealing systems are demonstrating practical advantages, with D-Wave’s Advantage2 system processing over 20.6 million optimization problems (cumulative since June 2022 across prototype and production systems), showing 134% usage growth in six months [25][26].
The calculating power of a quantum computer grows exponentially with the number of qubits. While quantum-computing roadmaps cite the number of qubits or competing metrics to indicate the rising power of these machines, significant technical challenges remain. The industry continues to operate in the Noisy Intermediate-Scale Quantum (NISQ) era, though important progress has been made. IBM has demonstrated systems connecting three chips to achieve 4,158 total qubits (as of 2025) [4], while Google’s Willow processor represents advances in error correction techniques (announced December 2024) [1].
The global quantum investment landscape has intensified dramatically. China announced a 1 trillion yuan ($138 billion) fund for emerging technologies including quantum computing, AI, semiconductors, and renewable energy (March 2025) [20], with quantum-specific investment estimated at approximately $15 billion. Meanwhile, JPMorgan invested $100 million in quantum company Quantinuum (2024) [13]. NIST released post-quantum cryptography standards in August 2024, urging immediate adoption by organizations [31].
Current quantum applications in finance primarily utilize hybrid quantum-classical approaches. These hybrid methods break large portfolio problems into manageable subproblems that can be solved on current quantum hardware [8]. Major financial institutions are transitioning from proof-of-concept experiments to pilot deployments, particularly for portfolio optimization using quantum annealing. Financial services companies that have not yet begun quantum initiatives face growing competitive disadvantage as early adopters gain operational experience with the technology.
TABLE 1: EXECUTIVE SUMMARY TABLE
Quantum Computing in Finance: Key Metrics Summary
| Key Metric | Current State (2025) | Near-term Target (2027) | Long-term Vision (2032+) |
| --- | --- | --- | --- |
| Largest Systems | 4,400+ qubits (D-Wave) [25] | 10,000+ physical qubits [29] | 1M+ qubits |
| Error Rates | 90% reduction achieved [3] | Partial error correction | Full fault tolerance |
| Finance Applications | Portfolio optimization pilots [8] | Production deployments | Quantum-native products |
| Investment Required | $100M+ for leaders [13] | $500M+ for competitiveness | Industry transformation |
| ROI Timeline | 2-3 years | 1-2 years | Immediate |
Introduction
Quantum computing exploits quantum mechanics, the properties and behavior of fundamental particles at the subatomic level, as predicted by our best current understanding of quantum physics. The goal of quantum computing is to build hardware and develop suitable algorithms that process information in ways that are superior to so-called classical computers, i.e. the ubiquitous digital computers that the Information Age was built on.
The essential elements of a quantum computer were postulated in the early 1980s, but of late work in this area has accelerated, with several large established companies and start-ups building quantum-computing hardware. An even larger ecosystem of software platforms and solution providers exists around the hardware providers. Collaboration models such as alliances and partnerships are common. Many universities are involved, and governments are also supporting quantum-computing research.
Typical of a new industry, standards and metrics are still in flux, and competing architectures, which leverage different mechanisms and implementations of quantum principles, vie for technical supremacy and investment dollars. Announcements of new breakthroughs are made almost daily, which makes it important to distinguish the hype from real progress.
This paper attempts to demystify the technology by explaining the basic principles of quantum computing and the competing technologies vying for quantum supremacy. An overview of the current quantum-computing industry and the main players is provided, as well as a look at the first applications and the different industries that could benefit. The focus then turns to the finance industry, with an overview of the most important computational problems in finance that lend themselves to quantum computing, including a deeper dive into portfolio optimization. Notable recent case studies and their participants are reviewed. The paper concludes with an assessment of the current state of quantum computing and the business impact that can be expected in the short and medium term.
A Brief Primer on Quantum Computing Technology
How Quantum Computing Works and Where It Holds Advantages or Not
What we now call classical (or conventional) digital computers perform all their calculations in an aggregate of individual bits that are either 0 or 1 in value, because they are implemented by transistors that are each either switched completely on or off. This is called binary logic [33], which is the essence of any digital computer, and is implemented in a longstanding computer-science paradigm originating with Turing and von Neumann. Conventional computers operate by switching billions of little transistors on and off, with all state changes governed by the computer’s clock cycle. With n transistors, there are 2^n possible states for the computer to be in at any given time. Importantly, the computer can only be in one of these states at a time. Digital computers are highly complex, with typical computer chips holding 20×10^19 bits, yet incredibly reliable at the semiconductor level, with fewer than one error in 10^24 operations. (Software and mechanical errors are far more common in computers.)
Analog computers preceded digital computers. In contrast to digital computers, classical analog computers perform calculations with electrical parameters (voltage or current) that take a full range of values along a continuous scale. Analog computers do not necessarily need to be electrical – they can be mechanical too, such as the first ones built by the ancient Greeks [34] – but the most sophisticated ones, from the 20th century, were electrical. Unlike digital computers, analog computers do not need a clock cycle, and all values change continuously. Before the digital revolution was enabled by the mass integration of transistors on chips, analog computers were used in several applications, for example to calculate flight trajectories or in early autopilot systems. But since the 1960s analog computers have largely fallen into disuse due to the dominance of digital computers.

Both classical digital and analog computers are at their core electrical devices, in the sense that they perform logic operations that are reflected by the electrical state of devices, typically semiconductor devices such as transistors (or vacuum tubes for mid-20th century analog computers), which comes about because of voltage differences and current flow. Current flow is physically manifested in terms of the flow of electrons in an electrical circuit [35].
Quantum computers, on the other hand, directly exploit the strange and counterintuitive behavior of sub-atomic particles (electrons, nuclei or photons), as predicted by quantum theory, to implement a new type of computation. In a quantum computer, quantum bits called qubits can be measured as |0⟩ or |1⟩, the quantum equivalents of the binary 0 and 1 in classical computers. However, due to a quantum property called superposition, qubits can occupy non-binary superposition states and interact with one another in those states during processing. It is this special property that allows quantum computers to theoretically offer exponentially more processing power than classical computers in some applications. Once the processing is complete, the result can only be measured in the binary states, |0⟩ or |1⟩, because the superposition is always collapsed by the measurement process.
Because of another curious quantum property called entanglement, the behavior of two or more quantum objects is correlated even if they are physically separated. According to the laws of quantum mechanics, this pattern holds whether a millimeter, a kilometer, or an astronomical distance separates them [36]. While one qubit sits in a superposition of two basis states, 10 entangled qubits can be in a superposition of 2^10 = 1,024 basis states.
Unlike the linear scaling of classical computers, the calculating power of a quantum computer grows exponentially with the number of qubits. It is this ability that gives quantum computers the extraordinary power of processing a huge number of possible outcomes simultaneously. When in the unobserved state of superposition, n qubits can contain the same amount of information as 2^n classical bits. So four qubits are equivalent to 16 classical bits, which might not sound like a big improvement. But 16 qubits are equivalent to 65,536 classical bits, and 300 qubits can represent more states than the estimated number of atoms in the observable universe. That is not only an astronomical number; it is beyond astronomical. This exponential effect is why there is so much hope for the future of quantum computing. With single- or double-digit numbers of qubits, the advantage over classical computing is not immediately clear, but the power of quantum computing scales exponentially beyond that in ways that are truly hard to imagine. This explains why there is so much anticipation about the technology taking off once a certain number of qubits has been reached in a reliable quantum computer.
However, to reliably encode information and expect it to be returned upon measurement, there are only two acceptable states for a qubit: |0⟩ and |1⟩. This means a qubit can only store 1 bit of information at a time. Even with many qubits, the scaling of information storage doesn’t improve beyond what you’d get classically: ten qubits can store 10 bits of information and one thousand qubits can store 1,000 bits. Because a qubit can only be measured in one of these two states, qubits cannot store any more data than conventional computer bits. There is thus no quantum advantage in data storage. The advantage is in information processing, and that advantage comes from the special quantum properties of a qubit – that it can occupy a superposition of states when not being measured.
Another point to keep in mind is that due to probabilistic waveform properties of qubits, quantum computers do not typically deliver one answer, but rather a narrow range of possible answers. Multiple runs of the same calculation can further narrow the range, but at the expense of lessening speed gains.
Classical computers will not be replaced by quantum computers. A primary reason is that quantum computers cannot natively run the “if/then/else” branching logic that is a cornerstone of the classical von Neumann architecture. Instead, quantum computers will be used alongside classical computers to solve the problems they are particularly good at, such as optimization problems.
The strengths of quantum computers in simultaneous calculations mean that they excel at finding optimal solutions to problems with a large number of variables, where the optimal combination is only found after trying out an enormous number of possible combinations or permutations. Such problems are found, for example, in optimizing any portfolio composition, in trying out millions of possible new molecular combinations for drugs, or in routing many aircraft between many hubs. In such problems there are typically 2^n possibilities and, classically, they all have to be tried out to find an optimal solution. If there are 100 elements to combine, it becomes a 2^100 computation, which is practically impossible for a classical computer, whereas a 100-qubit machine could in principle hold all 2^100 candidate combinations in superposition at once.
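To make the combinatorial explosion concrete, here is a toy brute-force portfolio selection sketch in Python. The returns, risks, and risk budget are made-up illustrative numbers: each of n assets is either in or out, so exhaustive search visits 2^n candidates, feasible for n = 5 but hopeless for n = 100.

```python
from itertools import product

# Made-up per-asset expected returns and risk contributions (illustrative only)
returns = [0.08, 0.12, 0.05, 0.20, 0.07]
risks   = [0.10, 0.25, 0.04, 0.40, 0.09]
risk_budget = 0.45

best, best_ret = None, -1.0
# Each asset is either excluded (0) or included (1): 2^5 = 32 candidates
for choice in product([0, 1], repeat=len(returns)):
    risk = sum(c * r for c, r in zip(choice, risks))
    ret  = sum(c * r for c, r in zip(choice, returns))
    if risk <= risk_budget and ret > best_ret:
        best, best_ret = choice, ret

# Exhaustive search works here, but for n = 100 assets it would need
# 2^100 evaluations, which is out of reach for any classical machine.
print(best, best_ret)
```

The same in/out structure is what quantum annealers encode directly as qubit states, which is why portfolio selection maps naturally onto quantum optimization.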
Quite a few hard problems in finance are in essence optimization problems and therefore meet the description of problems that can be solved by quantum computers. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit-scoring, and Monte Carlo-type risk analysis are other examples. That explains the keen interest of the finance industry in quantum solutions. The finance industry is also well positioned to be an early adopter, because financial algorithms are much quicker to deploy than algorithms that drive industrial or other physical processes.
Making a Quantum Computer
A quantum computer architecture can be seen as a stack with the following typical layers:
- At the bottom is the actual quantum hardware (usually held at near-absolute zero temperatures to minimize thermal noise, and/or in a vacuum)
- The next level up comprises the control systems that regulate the quantum hardware and enable the calculation
- Above those comes the software layer that implements the algorithms (and, in future, also error correction). It includes a quantum-classical interface that compiles source code into executable programs
- The top of the stack comprises the wider variety of services to utilize the quantum computer, e.g. the operating systems and software platforms that help translate real-life problems into a format suitable for quantum computing
There are many different ways to physically realize qubits—from using trapped calcium ions to superconducting structures [37]. In each case, quantum states are being manipulated to perform calculations. Quantum computers can entangle qubits by passing them through quantum logic gates. For example, a “CNOT” (conditional NOT) gate flips—or doesn’t flip—a qubit based on the state of another qubit. Stringing multiple quantum logic gates together creates a quantum circuit.
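The CNOT gate described above can be written as a 4×4 matrix acting on two-qubit state vectors. The sketch below uses plain NumPy (not any particular quantum SDK) to show the gate flipping a target qubit and, when the control is first put in superposition, producing an entangled Bell state:

```python
import numpy as np

# Computational basis states for one qubit
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# CNOT flips the target qubit only when the control qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# |10> (control = 1, target = 0) becomes |11>
state = np.kron(ket1, ket0)
print(CNOT @ state)   # amplitudes of |00>, |01>, |10>, |11>

# Putting the control in superposition first yields an entangled Bell state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
control = H @ ket0                         # (|0> + |1>)/sqrt(2)
bell = CNOT @ np.kron(control, ket0)
print(bell)                                # (|00> + |11>)/sqrt(2)
```

The final state cannot be written as a product of two single-qubit states, which is exactly what entanglement means.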
The designers of quantum computers need to master and control both superposition and entanglement
Without superposition, qubits would behave like classical bits, and would not be in the multiple states that allow quantum programmers to run the equivalent of many calculations at once. Without entanglement, the qubits would sit in superposition without generating additional insight by interacting. No calculation would take place because the state of each qubit would remain independent from the others. The key to creating business value from qubits is to manage superposition and entanglement effectively [38].
The simplest and most typical physical property that can serve as a qubit is the electron’s internal angular momentum, or spin for short. It has the quantum property of having only two possible projections on any coordinate axis, +1/2 or -1/2 in units of the Planck constant. For any chosen axis the two basic quantum states of the electron’s spin can be denoted as ↑ (up) or ↓ (down). But these are not the only states possible for a quantum bit, because the spin state of an electron is described by a quantum-mechanical wave function. That function includes two complex [39] numbers, called quantum amplitudes, α and β, each with its own magnitude. Both α and β have real and imaginary parts. The rules of quantum mechanics dictate that |α|² + |β|² = 1. The squared magnitudes |α|² and |β|² correspond to the probabilities of finding the electron’s spin in the basic states ↑ or ↓ when it is measured. Since those are the only two outcomes possible, the two squared magnitudes must sum to 1. In contrast to a classical bit, which can only be in one of its two binary states, a qubit can be anywhere in a continuum of possible states, as defined by the quantum amplitudes α and β. In the popular press this is often explained by the oversimplified, and somewhat mystical, statement that a qubit can exist simultaneously in both its ↑ and ↓ states. That is analogous to saying that a plane flying northwest is simultaneously flying both west and north, which is not incorrect strictly speaking, but not a particularly helpful mental model either.
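The measurement rule just described can be sketched numerically. The amplitudes α and β below are arbitrary illustrative values chosen so that |α|² + |β|² = 1; repeated simulated measurements recover the predicted probabilities:

```python
import numpy as np

# A single-qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha = complex(0.6, 0.0)
beta  = complex(0.0, 0.8)   # complex amplitudes may carry a phase
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-12

# Measurement collapses the state: outcome 0 with probability |alpha|^2 = 0.36,
# outcome 1 with probability |beta|^2 = 0.64
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=100_000,
                     p=[abs(alpha) ** 2, abs(beta) ** 2])
print(samples.mean())        # ~0.64, the empirical frequency of outcome 1
```

Note that the phase of β has no effect on a single measurement's statistics; phases only matter when amplitudes interfere during a computation.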
During computation, qubits can interact with one another while in their superposition state. For example, a set of 6 qubits can occupy any linear combination of all 2^6 = 64 different 6-bit strings. With 64 continuous complex coefficients describing this state, the space of configurations available to a quantum computer during a calculation is much greater than that of a classical one. The measurement limitations on storing information do not apply during the runtime execution of a quantum algorithm: during processing, every qubit can occupy a superposition, so every possible bit string (in this example, 2^6 = 64 different strings) can be combined. Each bit string in the superposition has an independent complex coefficient with a magnitude (Aᵢ) and a phase (θᵢ):
αᵢ = Aᵢ·e^(iθᵢ)
A modern digital computer, with billions of transistors in its processors, typically operates on 64-bit words, not 6 bits as in our quantum example above. A 64-bit register can represent 2^64 states. While 2^64 is a large number, equal to approximately 2 × 10^19, quantum computing can offer much more. The space of continuous states of a quantum computer is much larger than the space of classical bit states. That is because of the possibility of many particles interacting at the quantum level to form a common wave function, allowing changes in one particle to affect all the others instantaneously and in a well-ordered manner. That is akin to massive parallel computing, which can beat classical multicore systems.
Quantum computing operations can mostly be handled according to the standard rules of linear algebra, in particular matrix multiplication. The quantum state is represented by a state vector [40] written in matrix form, and the gates in the quantum circuit (whereby the calculations are executed) are represented as matrices too. Multiplying a state vector by a gate matrix yields another state vector. Recent progress has been made to use quantum algorithms to crack non-linear equations, by using techniques that disguise non-linear systems as linear ones [41].
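The matrix picture above can be sketched in a few lines of NumPy. The gates and states here are standard textbook definitions, not tied to any specific quantum SDK: applying a gate is a matrix-vector product, and chaining gates into a circuit is a matrix product.

```python
import numpy as np

# State vectors are column vectors; gates are unitary matrices
ket0 = np.array([1, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                # NOT gate

# Applying a gate is a matrix-vector product; a circuit is a matrix product
state = H @ ket0          # equal superposition of |0> and |1>
state = X @ state         # circuit: X applied after H

# The whole circuit could equally be precomputed as one matrix
circuit = X @ H
assert np.allclose(circuit @ ket0, state)

# Gates are unitary, so the norm (total probability) is always preserved
print(np.linalg.norm(state))   # ~1.0, up to floating-point rounding
```

This is also why classical simulation of quantum circuits breaks down at scale: the state vector for n qubits has 2^n complex entries, so the matrices become astronomically large.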
Key Quantum-Computer-Science Concepts
The possibility of quantum computing was raised by Caltech physicist Richard Feynman in 1981. David Deutsch, considered by most to be the founder of quantum computing, first defined a quantum computer in a seminal paper in 1985 [42].
In 1994, a Bell Labs mathematician, Peter Shor, developed a quantum-computing algorithm that can efficiently decompose any integer into its prime factors [43]. It has since become known as Shor’s algorithm and has great significance for quantum computing. Shor’s algorithm was a purely theoretical exercise at the time, but it anticipated that a hypothetical quantum computer could one day break the hard factoring problems used as the basis for modern cryptography. Shor’s algorithm relies on the special properties of a quantum machine. The most efficient known classical factoring algorithm, the general number field sieve, has a runtime that grows roughly as an exponential of a constant × d^(1/3) for an integer with d digits; Shor’s algorithm needs only polynomial runtime, namely a constant × d^3. Accordingly, classical computers are limited to factoring integers with only a few hundred digits, which is why using integers with thousands of digits in cryptographic keys is considered to make for practically unbreakable codes. But a quantum computer using the Kitaev version of Shor’s algorithm only needs about 10d qubits, and will have a runtime roughly proportional to d^3 [44].
In summary, Shor’s algorithm shows that a quantum computer can solve in polynomial time a problem that, as far as we know, classical computers can only solve in exponential time [45]. (Factoring is widely believed to be classically intractable, although it has not been proven NP-hard.) Shor’s algorithm thus demonstrates by how much quantum computing can improve processing time over classical computing. While a full-scale quantum computer with the thousands of qubits needed to employ Shor’s algorithm in practice to crack codes is not yet available, many players are working towards machines of that size.
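The exponential-versus-polynomial gap can be illustrated by plugging numbers into the two runtime shapes quoted above. The constants below are arbitrary placeholders; only the asymptotic crossover matters, not the absolute values:

```python
import math

def classical_steps(d, c=1.9):
    # Rough shape of general-number-field-sieve cost for a d-digit integer:
    # exponential in d^(1/3). The constant c is illustrative only.
    return math.exp(c * d ** (1 / 3))

def shor_steps(d):
    # Rough shape of Shor's algorithm cost: polynomial, ~d^3
    return d ** 3

# For small d the polynomial can be larger, but the exponential
# inevitably overtakes it as d grows
for d in (100, 1_000, 10_000):
    print(d, f"{classical_steps(d):.3e}", f"{shor_steps(d):.3e}")
```

At key sizes of a few thousand digits, the exponential term dwarfs the polynomial one, which is precisely the margin that makes classical factoring infeasible and quantum factoring threatening.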
Another important early quantum algorithm is Grover’s algorithm, a search algorithm which finds a particular register in an unordered database. This problem can be visualized as a phonebook with N names arranged in completely random order. In order to find someone’s phone number with a probability of ½, any classical algorithm (whether deterministic or probabilistic) will need to look at a minimum of N/2 names. But the quantum algorithm needs only O(√N) steps [46]. This algorithm can also be adapted for optimization problems.
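Grover's algorithm is simple enough to simulate classically for small N. The sketch below runs the standard oracle-plus-diffusion iteration on a 16-entry "phonebook" represented as a state vector; the target index is chosen arbitrarily for the demonstration:

```python
import numpy as np

# Statevector simulation of Grover search over N = 2^n items
n = 4                    # 4 qubits -> N = 16 "phonebook" entries
N = 2 ** n
target = 11              # index we are searching for (known only to the oracle)

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

# The optimal number of iterations is about (pi/4) * sqrt(N)
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[target] *= -1                   # oracle: flip the target's phase
    state = 2 * state.mean() - state      # diffusion: inversion about the mean

# Measuring now yields the target index with high probability
print(np.argmax(state ** 2))
```

After only 3 iterations (versus an expected 8 classical lookups for N = 16), the target amplitude dominates. The O(√N) advantage is quadratic rather than exponential, but it applies to a very broad class of search and optimization problems.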
Most quantum calculations are performed in what is called a quantum circuit. The quantum circuit is a series of quantum gates that operate on a system of qubits. Each quantum gate has inputs and outputs and operates akin to the hardware logic gates in classical digital computers. Like digital logic gates, the quantum gates are connected sequentially to implement quantum algorithms.
Quantum algorithms are algorithms that run on quantum computers, and which are structured to use the unique properties of quantum mechanics, such as superposition or quantum entanglement, to solve particular problem statements. Major quantum algorithms include the quantum evolutionary algorithm (QEA), the quantum particle swarm optimization algorithm (QPSO), the quantum annealing algorithm (QAA), the quantum neural network (QNN), the quantum Bayesian network (QBN), the quantum wavelet transform (QWT), and the quantum clustering algorithm (QC) [47]. A comprehensive catalog of quantum algorithms can be found online in the Quantum Algorithm Zoo [48].
Quantum software is the umbrella term used to describe the full collection of quantum computer instructions, from hardware-related code, to compilers, to circuits, all algorithms and workflow software.
Quantum annealing [49] is an alternative model to circuit-based algorithms, as it is not built up out of gates. Quantum annealing naturally returns low-energy solutions by utilizing a fundamental law of physics that any system will tend to seek its minimum state. In the case of optimization problems, quantum annealing uses quantum physics to find the minimum energy state of the problem, which equates to the optimal or near-optimal combination of its constituent elements [50].
An Ising machine is a non-circuit alternative that works for optimization problems specifically. In the Ising model, the energy from interactions between the spins of every pair of electrons in a collection of atoms is summed. Since the amount of energy depends on whether spins are aligned or not, the total energy of the collection depends on the direction in which each spin in the system points. The general Ising optimization problem is determining in which state the spins should be so that the total energy of the system is minimized. To use the Ising model for optimization requires mapping parameters of the original optimization problem, such as an optimal route for the Traveling Salesman [51], into a representative set of spins, and to define how the spins influence one another [52].
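A tiny Ising instance can be solved by brute force to make the formulation concrete. The couplings J and fields h below are made-up numbers; a real annealer or Ising machine relaxes physically toward the minimum instead of enumerating all 2^n spin configurations:

```python
import itertools

# Tiny Ising instance: 4 spins, pairwise couplings J and per-spin fields h.
# Energy: E(s) = -sum_ij J_ij * s_i * s_j - sum_i h_i * s_i, s_i in {-1, +1}
J = {(0, 1): 1.0, (1, 2): -0.5, (2, 3): 0.8, (0, 3): -0.3}
h = [0.1, 0.0, -0.2, 0.0]

def energy(spins):
    e = -sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    e -= sum(hi * si for hi, si in zip(h, spins))
    return e

# Brute force over all 2^4 = 16 configurations; an annealer seeks this
# minimum physically rather than by enumeration
best = min(itertools.product([-1, 1], repeat=4), key=energy)
print(best, energy(best))
```

Mapping a business problem (such as the Traveling Salesman or a portfolio) onto this form means choosing J and h so that low energy corresponds to a good solution; the hardware then does the minimization.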
Hybrid computing typically entails transferring the problem (say optimization) into a quantum algorithm, of which the first iteration is run on a quantum computer. This provides a very fast answer, but only a rough assessment of the valid total solution space. The refined answer is then found with a powerful classical computer, which only has to examine a subset of the original solution space [53].
The Fundamental Challenges
The Achilles heel of the quantum computer is the loss of coherence, or decoherence, caused by mechanical (vibration), thermal (temperature fluctuations), or electromagnetic disturbance of the subatomic particles used as qubits. Until the technology improves, various workarounds are needed. Commonly algorithms are designed to reduce the number of gates in an attempt to finish execution before decoherence and other sources of errors can corrupt the results [54]. This often entails a hybrid computing scheme which moves as much work as possible from the quantum computer to classical computers.
Current estimates by experts are that truly useful quantum computers would need between 1,000 and 100,000 qubits [92]. However, quantum-computing skeptics such as Mikhail Dyakonov, a noted quantum physicist, point out that the enormous number of continuous parameters describing the state of a useful quantum computer might also be its Achilles heel. Taking the low end, a 1,000-qubit machine would imply 2^1,000 parameters describing its state at any moment. That is roughly 10^300, a number greater than the number of subatomic particles in the observable universe: “A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.” [55] How would error control be done for 10^300 continuous parameters? According to quantum-computing theorists, the threshold theorem proves that it can be done. Their argument is that once the error per qubit per quantum gate is below a certain threshold value, indefinitely long quantum computation becomes possible, at the cost of substantially increasing the number of qubits needed. The extra qubits are needed to handle errors by forming logical qubits out of multiple physical qubits. (This is a bit like error correction in current telecom systems, which use extra bits to validate data.) But that greatly increases the number of physical qubits to handle, whose state parameters, as we have seen, are already more than astronomical. At the very least, this brings into perspective the magnitude of the technological problems that scientists and engineers will have to overcome.
To put the comparative size of the quantum error-correction problem in practical terms: for a typical 3-volt CMOS logic circuit used in classical digital computers, a binary 0 is any voltage measured between 0 V and 1 V, while a binary 1 is any voltage measured between 2 V and 3 V. Thus if, say, 0.5 V of noise is added to a binary-0 signal, the measured 0.5 V still correctly indicates a binary value of 0. For this reason, digital computers are very robust to noise. For a typical qubit, however, the difference in energy between a zero and a one is just 10^-24 joules, one ten-trillionth as much energy as an X-ray photon carries. Error correction is one of the biggest hurdles to overcome in quantum computing, the concern being that it will impose such a huge overhead, in terms of auxiliary calculations, that it will make it very hard to scale quantum computers.
After Dyakonov published the skeptic’s viewpoint in 2018, a vigorous debate followed [56]. A typical response comes from an industry insider, Richard Versluis, systems architect at QuTech, a Dutch quantum collaboration. Versluis acknowledges the engineering challenge of controlling a quantum computer while ensuring that the control signals and qubits perform as desired. Major sources of potential errors are quantum rotations that are not perfectly accurate, and decoherence as qubits lose their entanglement and the information they contain. Versluis goes on to define a five-layered quantum-computer architecture that he believes will be up to the task. From top to bottom, the layers are: 1. application, 2. classical processing, 3. digital processing, 4. analog processing, and 5. quantum processing. Together, the digital-, analog-, and quantum-processing layers comprise the quantum processing unit (QPU). But Versluis also acknowledges that quantum error correction could solve the fundamental problem of decoherence only at the expense of 100 to 10,000 error-correcting physical qubits per logical (calculating) qubit. Furthermore, each of these millions of qubits will need to be controlled by continuous analog signals. The biggest challenge of all is performing thousands of measurements per second in a way that does not disturb the quantum information (which must remain unknown until the end of the calculation), while still catching and correcting errors. The current paradigm of measuring all qubits with analog signals will not scale to larger machines, and a major advance in the technology will be required [57].
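The scale of the overhead Versluis describes is easy to make concrete with simple arithmetic, using the figures quoted above (a 1,000-logical-qubit machine and 100 to 10,000 physical qubits per logical qubit):

```python
# Physical-qubit overhead for error correction, from the estimates quoted
# above: 1,000 logical qubits at 100-10,000 physical qubits each
logical = 1_000
for overhead in (100, 1_000, 10_000):
    physical = logical * overhead
    print(f"{overhead:>6,}x overhead -> {physical:>12,} physical qubits")
```

Even the optimistic end of the range implies hundreds of thousands of physical qubits, each needing continuous analog control, which is why measurement and control scaling is seen as a make-or-break engineering problem.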
Most experts agree that we will have to live with quantum computers that have high levels of uncorrected errors for the next few years. There is even an accepted industry term and acronym for such machines: NISQ (Noisy Intermediate-Scale Quantum) devices. The NISQ era is expected to last for at least the next five years, barring any major breakthroughs that might shorten that timeline.
Current Status of Essential Building Blocks and Systems
Once critical technical breakthroughs are made, quantum computing adoption may happen faster than expected due to the prevalence of cloud computing. Making quantum computing services easily accessible over the cloud speeds both adoption and learning. It has the added advantage that it forces hardware makers to focus on building quantum computers with a high percentage of uptime, so as to ensure continued availability over the cloud.
2025 has been a transformative year for quantum computing, with major advancements across hardware, error correction, AI integration, and quantum networks [3]. However, it’s important to note that the industry remains in the NISQ era, with full fault-tolerant quantum computing still years away. Key recent developments include:
Hardware Advances (as of early 2025)
- IBM’s Kookaburra system demonstrated connecting three chips into a 4,158-qubit system [4]
- Google’s Neutral-Atom Quantum System achieves 99.5% fidelity with rubidium atoms [3]
- D-Wave’s Advantage2 system features 4,400+ qubits with improved coherence and connectivity [25]
Error Correction Progress
- Google unveiled its Willow processor (December 2024), described as a significant achievement in error correction [1]
- AWS’s Ocelot chip reduces error correction costs by 90% (February 2025) [3]
- Quantinuum’s Apollo system achieved a quantum volume over two million (as of 2024) [2]
We now have quantum computers able to perform tasks beyond the reach of classical systems [5], though these are typically specialized problems rather than general-purpose computing. Most current applications utilize hybrid quantum-classical approaches rather than pure quantum advantage.
Classical Computing Competition
It’s crucial to note that classical computing continues to advance rapidly:
- Tensor network methods now compete with quantum approaches for certain problems
- GPU advances are changing performance baselines
- Quantum-inspired classical algorithms are achieving surprising results
Most quantum computer makers already offer cloud access to their latest systems. There are programming environments – software development kits (SDKs) that facilitate the building of quantum circuits – available over the cloud for quantum programmers to learn how to write the software that unleashes the capabilities of quantum computing, and to experiment with it. As more functionality is added to the hardware, these SDKs are continually updated.
The implication is that a whole ecosystem is being brought up to speed on how to make the best use of a quantum capability that does not quite exist yet in its full form. An analogy would be having had flight simulators to train future pilots while the Wright brothers were still figuring out how to keep their plane in the air for more than a few hundred feet. The upside of this approach is that any real advances in making reliable quantum computers with capabilities superior to classical computers will be very quickly exploited by real-world applications. This situation is in contrast to most major technological breakthroughs we have seen in the past. For example, it took a generation or two for industrial engineers to learn how to properly use electric power in place of steam power in factories. More recently, it took a generation to fully exploit the capabilities of digital computing in business and elsewhere. But in the case of quantum computing, all the knowledge building in anticipation of a successful quantum computer could be rapidly translated into applications by a corps of developers who are all trained up and ready to “fly the plane” once it is finally built. That is the optimistic perspective.
Quantum circuits are already being developed using quantum programming languages and so-called quantum development kits (QDKs) [58], such as IBM’s Qiskit and Google’s Cirq, both based on Python, and Microsoft’s Q#, based on the C# language. The next step is to develop libraries and workflows for different application domains. Examples of the former include IBM’s Aqua and Microsoft’s Q# libraries. Examples of the latter include D-Wave’s Ocean development tool kit, for building hybrid quantum-classical applications and translating quantum optimization problems into quantum circuits, and Zapata’s Orquestra, for composing, running, and analyzing quantum workflows. On top of the circuits and libraries come the domain-specific application platforms. “Orchestrating and integrating classical and quantum workflows to solve real problems with hybrid quantum-classical algorithms is the name of the game for the next few years.” [59]
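SDK syntax differs, but under the hood every gate-based kit manipulates a vector of complex amplitudes. A minimal stdlib-Python sketch of the idea, not any vendor's API: the textbook two-gate circuit (Hadamard, then CNOT) that turns |00⟩ into an entangled Bell state.

```python
import math

# A 2-qubit state is 4 complex amplitudes, indexed |00>, |01>, |10>, |11>.
state = [1, 0, 0, 0]  # start in |00>

def hadamard_q0(s):
    """Apply a Hadamard gate to qubit 0 (the left bit of |q0 q1>)."""
    a = 1 / math.sqrt(2)
    return [a * (s[0] + s[2]), a * (s[1] + s[3]),
            a * (s[0] - s[2]), a * (s[1] - s[3])]

def cnot_q0_q1(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = cnot_q0_q1(hadamard_q0(state))
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~[0.5, 0, 0, 0.5]: measurement yields 00 or 11, never 01 or 10
```

The SDKs mentioned above wrap exactly this kind of linear algebra in circuit-builder APIs and then compile it to hardware-specific control pulses.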
Quantum-inspired software is already in operation, since these applications run on classical computers rather than quantum machines. A major example is Fujitsu’s Quantum-Inspired Digital Annealer Services [60]. Even at a theoretical level, quantum ideas have already borne fruit in several problem areas, where restructuring problems using quantum principles has led to improved algorithms, new proofs, and the refutation of erroneous old algorithms [61]. Quantum-inspired software is closely related to quantum-ready software, which can be run on suitable quantum computers once they are available.
How Quantum Computing Could Change the World
The industrialization of quantum computers has entered a critical period. Major countries and leading enterprises in the world are investing huge human and material resources to advance research in quantum computing.
The Race for Quantum Supremacy
Google used the term quantum supremacy in October 2019 when it announced the results of its “quantum supremacy experiment” in a blog [62] and an article in Nature [63]. The experiment used Google’s 54-qubit processor, named “Sycamore,” to perform a contrived benchmark test in 200 seconds that would take the fastest supercomputer 10,000 years to do. However, it’s important to understand that this was a specialized problem designed to showcase quantum capabilities, not a practical application.
Quantum supremacy was originally defined by Caltech’s John Preskill [64] as the point at which the capabilities of a quantum computer exceed those of any available classical computer for a specific task; the latter is usually understood to be the most advanced supercomputer built on classical architecture. It’s crucial to distinguish between:
- Quantum Supremacy: Solving any problem faster than classical computers (even contrived ones)
- Quantum Advantage: Providing practical business value compared to classical approaches
This led IBM researchers to formulate the concept of quantum volume (QV) in 2017. More QV means a more powerful computer, but QV cannot be increased by increasing only the number of qubits. QV is a hardware-agnostic performance measurement for gate-based quantum computers that considers a number of elements including the number of qubits, connectivity of the qubits, gate fidelity, cross talk, and circuit compiler efficiency. Recent quantum volume achievements include:
- Quantinuum’s Apollo system with a quantum volume over two million (as of 2024) [2]
- IBM’s published quantum volume of 512 (as of 2023)
- Honeywell’s System Model H1 achievements (2023-2024)
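Quantum volume is conventionally reported as 2^d, where d is the largest size at which the machine can successfully run a d-qubit, depth-d random test circuit. That convention makes the figures above easy to relate (a sketch; the two published figures cited above are the inputs):

```python
import math

def quantum_volume(d):
    """QV = 2**d for the largest passing d-qubit, depth-d test circuit."""
    return 2 ** d

def implied_circuit_size(qv):
    """Circuit width/depth implied by a reported quantum volume."""
    return int(math.log2(qv))

print(quantum_volume(9))             # 512 -- IBM's published figure above
print(implied_circuit_size(2 ** 21)) # 21  -- a QV "over two million"
```

So a quantum volume "over two million" corresponds to running square circuits only about 21 qubits wide and deep, which is why QV growth is reported on a log scale rather than as raw qubit counts.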
Rather than thinking about quantum supremacy as an absolute threshold or milestone, it is wiser to think about so-called quantum supremacy experiments as benchmarking experiments for the new technology, perhaps similar to the way we came to express automobile engine power in measures of horsepower. There is also an intriguing question lingering over the whole concept of quantum supremacy, which is: “How could anyone know… that a quantum computer is genuinely doing something that is impossible for a classical one to do – rather than that they just haven’t yet found a classical algorithm that is clever enough to do the job?” [65] It may be that the advent of quantum computing will force and inspire new developments in classical computing algorithms, something we are already seeing in the concept of quantum-inspired computing software.
There is a second meaning one could attach to quantum supremacy, which is to mean which nation will hold the technological advantage to this technology of the future. The current investment landscape shows intense global competition:
China’s Quantum Investment
China announced a 1 trillion yuan ($138 billion) government-backed fund to support emerging technologies, including quantum computing, artificial intelligence, semiconductors, and renewable energy (March 2025) [20]. It’s important to note this fund covers multiple technologies, not just quantum computing. China’s quantum-specific public investment is estimated at approximately $15 billion [23].
Global Investment Comparison (as of early 2025):
- China: ~$15 billion specifically for quantum (part of larger $138B emerging tech fund)
- United States: ~$5 billion federal funding plus substantial private investment
- European Union: €1 billion Quantum Flagship program
- Australia: AU$893 million ($582 million USD) [22]
- UK and Japan: ~$300 million each
Table 2. Global Quantum Computing Investment Landscape (as of 2025)
| Region | Public Investment | Private Investment | Key Programs | Strategic Focus |
| --- | --- | --- | --- | --- |
| China | ~$15B quantum-specific [23] | Declining (51k→1.2k startups) [24] | Part of $138B tech fund [20] | Hardware, quantum communications |
| United States | ~$5B federal | Strong VC ecosystem | NIST, NSF, DOE programs | Full stack, software, applications |
| European Union | €1B [69] | Moderate growth | Quantum Flagship | Theory, applications, standards |
| United Kingdom | ~$300M | Growing fintech focus | NQCC, UK Research and Innovation | Finance applications, cryptography |
| Australia | AU$893M ($582M USD) [22] | Silicon-focused | Silicon Quantum Computing | Hardware innovation |
| Canada | ~$360M CAD | Strong in startups | CIFAR, NRC programs | Software, quantum ML |
| Japan | ~$300M | Corporate-led | Moonshot R&D Program | Industrial applications |
| Singapore | ~$150M | Regional hub | Centre for Quantum Technologies | Research, talent development |
Competitive Assessment
In 2022, GlobalData said the U.S. was about five years ahead of China in the quantum computing race. Now, in 2024, the firm considers the two countries as “nearly equal” [21]. However, China’s venture capital investment has plummeted, with the number of startups dropping from over 51,000 in 2018 to just around 1,200 in 2023 [24], which could limit China’s private sector quantum innovation.
Competing Architectures
Quantum computers are very hard to build. They require intricate manipulations of subatomic particles, and must operate in a vacuum environment or at cryogenic temperatures.
The state of quantum computing resembles the early days of the aircraft and automobile industries, when there was a similar proliferation of diverse architectures and exotic designs. Eventually, as quantum technology matures, a convergence can be expected similar to what we have seen in those industries. In fact, the arrival of such a technological convergence would be a good measure of a growing maturity of quantum computing technology.
There are a number of technical criteria [66] for making a good quantum computer:
- Qubits must stay coherent for long enough to complete the computation while in superposition. That requires isolation, because decoherence occurs when qubits interact with the outside world
- Qubits must be highly connected. This occurs through entanglement and is needed for operations to act on multiple qubits
- High-fidelity operations are needed. As pointed out above, classical digital computers rely on the digital nature of signals for noise resistance. However, since qubits need to precisely represent numbers that are not just zero and one during the computation state, digital noise reduction is not possible and the noise problem is more analogous to that in an old-fashioned analog computer
- Gate operations must be fast. In practice, this is a trade-off between maintaining coherence and achieving high fidelity
- High scalability. It should be obvious that quantum computers will only be useful when they can be scaled large enough to solve valuable problems
Currently, the quantum technologies showing the greatest promise and attracting the most interest and investment dollars include superconducting qubits, trapped ions, neutral atoms, and quantum annealing. These and other technologies are presented in Table 3, along with the main proponents in each technology.
Table 3. Qubit Technologies and Main Proponents (as of early 2025)
| Technology | Main Proponents | Status |
| --- | --- | --- |
| Superconducting qubits (called transmons by some) are realized by using a microwave signal to put a resistance-free current in a superposition state. Fast gate times, proven technology, but fast decoherence and requires near absolute zero temperatures. | IBM, Google, Rigetti, Alibaba, Intel, NEC, Quantum Circuits, Oxford Quantum Circuits | Production systems available |
| Quantum Annealing specialized for optimization problems, uses quantum tunneling to find minimum energy states. Not gate-based. | D-Wave | Production systems with 4,400+ qubits [25] |
| Ion Trap quantum computers work by trapping ions in electric fields [67]. Longer coherence times but requires high vacuum and complex laser systems. | IonQ, Honeywell/Quantinuum, Alpine Quantum Technologies | Commercial systems available |
| Photonic qubits are photons operating on silicon chips. No extreme cooling needed, highly scalable using established fabrication. | PsiQuantum, Xanadu | Early commercial systems |
| Neutral atoms use laser “tweezers” to trap atoms. Achieving 99.1-99.5% fidelity [28]. | PASQAL (targeting 10,000 qubits by 2026) [29], QuEra, Atom Computing | Rapid development phase |
| Silicon qubits use electrons added to silicon, controlled with microwaves. Leverages semiconductor industry experience. | Intel, Silicon Quantum Computing | Research phase |
| Topological qubits would use exotic quasiparticles. Microsoft’s Majorana approach failed after 2021 paper retraction. | None currently | Theoretical only |
Technical Note: All qubit counts refer to physical qubits unless specified as logical qubits. Error rates vary significantly by operation type (single-qubit gates, two-qubit gates, readout) and measurement methodology. Performance metrics continue to evolve rapidly.
There is not yet a universally-accepted measure to compare the computing power of the different technologies. That is because obvious measures such as calculation cycles (qubit lifetime / gate operation time) are skewed by the current infidelities of gate operations, and by the varying overheads imposed by error-correction schemes. For all gate-based technologies, clock speeds will also be limited for the foreseeable future due to the need for fault tolerance [68].
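The "calculation cycles" measure mentioned here is simply coherence lifetime divided by gate time. A quick comparison shows why it is a skewed metric on its own; the lifetimes and gate times below are illustrative orders of magnitude, not vendor specifications:

```python
# "Calculation cycles" = qubit coherence lifetime / gate operation time.
# The figures below are illustrative orders of magnitude only.
platforms = {
    "superconducting": {"lifetime_s": 100e-6, "gate_s": 20e-9},
    "trapped ion":     {"lifetime_s": 1.0,    "gate_s": 10e-6},
}

def calc_cycles(lifetime_s, gate_s):
    """Rough number of gate operations possible before decoherence."""
    return lifetime_s / gate_s

for name, p in platforms.items():
    print(f"{name}: ~{calc_cycles(p['lifetime_s'], p['gate_s']):,.0f} cycles")
```

The slower trapped-ion gates can still yield more usable cycles because coherence lasts so much longer, yet neither number accounts for gate infidelity or error-correction overhead, which is exactly the text's point.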
Industry Structure and Major Players
The quantum computing hardware covered above is of course only one layer of the quantum-computing stack. Immediately above the hardware is the systems layer, and on top of the systems layer are the software and applications layers. At the very top is the services layer, which is most commonly cloud-enabled these days [68].
Only a relatively small proportion of firms in the quantum computing ecosystem actually build working quantum computers, since that requires major resources and highly-specialized skills in quantum physics and hardware engineering. Many of the companies that self-identify as quantum-computing companies are actually on the software and services side. It is more common for providers of quantum hardware to move upwards in the ecosystem by adding software and services, than it is for software and services players to attempt to move downward by developing their own quantum computing hardware.
Hardware makers typically enable access to their quantum computers over the internet and through the cloud, often through subscription plans, sometimes for free. The cloud-based offering is typically a hybrid quantum-classical computing system, which breaks up the problem to be solved into parts that can be solved by conventional computers and parts that are best solved by the quantum computer. This situation resembles the early days of classical computing when only a few computers were available, these computers filled whole rooms, and they had to be shared among many users.
Hardware and Systems for Quantum Computing
The major competitors in the quantum computing hardware space each make their own quantum computers with competing architectures and specifications. The most significant of these general systems are made by the large companies IBM, Google, and Honeywell/Quantinuum, and the start-ups Rigetti and IonQ. D-Wave makes and sells hybrid quantum computers that are specialized for quantum annealing and particularly well-suited to solving optimization problems, such as those of interest to the finance industry.
- D-Wave. [69] D-Wave is a Canadian company and a foremost proponent of quantum annealing, which includes optimization for finance applications. It is a pioneer in selling quantum computers to other organizations [70]. Its latest model, the Advantage2 quantum computer, has 4,400+ qubits with 20-way connectivity, improved coherence times, and fast anneal capabilities. The system uses only 12.5 kilowatts of electricity [25][26]. More than 20.6 million optimization problems have been run on Advantage2 prototypes since June 2022 (cumulative across prototype and production systems), with customer use up 134% in the last six months [26]. D-Wave reported fiscal year 2024 bookings exceeding $23 million, a 120% increase over fiscal year 2023, including its first on-premise system sale [27].
- Google. [71] Google (Alphabet Inc.) showed their strategic commitment to quantum computing with their fall 2019 announcement that their quantum computer had achieved so-called quantum supremacy, surpassing classical supercomputers at a particular task. In December 2024, Google unveiled its Willow processor, described as a significant achievement in quantum error correction [1][6]. Google has also developed neutral-atom quantum systems achieving 99.5% fidelity [3]. Google researchers have published extensively on quantum computing applications.
- Honeywell Quantum Solutions (now part of Quantinuum). [72] Quantinuum’s quantum systems deliver high performance across industry benchmarks, with the Apollo system achieving a quantum volume over two million (as of 2024) [2]. The company uses trapped-ion technology with unique mid-circuit measurement capabilities. JPMorgan Chase invested $100 million in Quantinuum (2024) [13], signaling major financial industry confidence.
- IBM Quantum. [73] IBM is one of few remaining manufacturers of classical mainframe computers. IBM’s roadmap shows the Kookaburra processor connecting three chips to create a 4,158-qubit system (as of 2025) [4]. IBM Quantum Heron can now accurately run certain classes of quantum circuits with up to 5,000 two-qubit gate operations [7]. IBM is intent on building a cloud-enabled ecosystem of quantum partners, actively promoting coding skills through annual challenges. JPMorgan Chase and Barclays were charter members of the IBM quantum computing network [74].
- IonQ. [75] IonQ is a startup which introduced the first commercial trapped-ion quantum computer. The company has a roadmap targeting broad quantum advantage by 2025, focusing on trapped ion technology [2]. IonQ uses “Algorithmic Qubits” (AQ) as its performance metric rather than raw qubit counts. The company has partnerships with AWS, Microsoft, and Google Cloud.
- Intel. Intel is developing both superconducting and silicon spin qubits. Intel Labs have developed “Horse Ridge,” a cryogenic control chip that operates near the qubits, reducing system complexity. Intel published a 2024 Nature paper on 300-mm spin qubit wafers, demonstrating uniformity and fidelity [2]. The company aims to leverage its semiconductor manufacturing expertise for quantum computing.
- PASQAL. PASQAL raised €100 million in Series B funding, targeting delivery of a 1,000-qubit quantum computer by 2024 [30]. The company issued a roadmap for delivering systems with 10,000 physical qubits in 2026 and full fault-tolerance operation using 128-plus logical qubits in 2028 [29]. Founded by Nobel laureate Alain Aspect, PASQAL uses neutral atom technology with customers including BMW, BASF, Johnson & Johnson, and Siemens.
Additional Key Players in Quantum Hardware and Systems:
- QuEra Computing: Boston-based neutral atom company targeting 10,000 qubits
- Rigetti Computing: Berkeley-based startup with cloud-accessible superconducting systems
- PsiQuantum: Developing photonic quantum computers targeting 1 million qubits
- Xanadu: Canadian photonic quantum company with cloud-accessible systems
Software Platforms and Solutions to Enable Quantum Computing
The two main competing platforms in North America for general cloud-based quantum computing solutions are those from Microsoft and Amazon, with specialized platforms from companies like Multiverse focusing on financial applications.
- Amazon Braket. [77] Amazon Braket is a fully-managed cloud-based quantum computing service offering access to quantum hardware from D-Wave, IonQ, Rigetti, and others. Amazon’s Ocelot chip, launched in February 2025, reduces error correction costs by up to 90% [2]. The platform focuses on optimization, molecular simulation, and quantum machine learning.
- Microsoft Azure Quantum. [79] Microsoft’s Azure Quantum provides a full ecosystem of quantum solutions with hardware from multiple providers. Despite the failure of their topological qubit approach, Microsoft continues to advance quantum software and cloud services. The platform includes the Q# programming language and quantum development kit.
- Multiverse Computing. [80] This quantum computing startup with offices in Spain and Canada focuses specifically on quantum-computing software solutions for the financial industry. They offer quantum-inspired solutions including tensor networks and optimization algorithms that run on all quantum hardware platforms.
- 1QBit. [76] Vancouver-based startup providing hardware-agnostic quantum computing platforms with applications for optimization and financial modeling. Major investors include RBS and Allianz.
- Cambridge Quantum Computing (CQC). [78] Now part of Quantinuum, CQC builds architecture-agnostic quantum computing solutions focusing on chemistry, machine learning, cybersecurity, and finance.
- QC Ware. [81] Enterprise software startup with a large team of quantum algorithm experts, partnering with D-Wave, IBM, IonQ, and Rigetti for near-term quantum applications.
- Quantum Computing Inc (QCI). [82] Provides the Mukai platform for quantum-inspired and quantum-ready methods, with a Quantum Asset Allocator (QAA) specifically for portfolio optimization.
- IBM Qiskit. [58] Open-source quantum development framework that has become an industry standard for quantum circuit development.
Additional Software Providers of Quantum Solutions:
- Zapata Computing: Orquestra workflow platform for quantum applications
- Chicago Quantum: Specializes in quantum portfolio optimization algorithms
- Toshiba: Simulated Bifurcation Machine for quantum-inspired optimization
Academic and Other Research Centers for Quantum Computing
The U.S. National Institute of Standards and Technology (NIST) has been at the center of quantum computing research since the early 1990s. Partnerships between NIST and public universities have created research institutes such as JILA [83] (with the University of Colorado Boulder) and the Joint Quantum Institute (JQI) [84]. According to a recent list compiled by the Quantum Daily [85], leading university-based quantum computing research organizations include:
- The Institute for Quantum Computing at the University of Waterloo
- Oxford Quantum at the University of Oxford
- The Harvard Quantum Initiative
- The Center for Theoretical Physics at MIT
- The Centre for Quantum Technologies at National University of Singapore
- The Berkeley Center for Quantum Information and Computation
- The Joint Quantum Institute at University of Maryland
- Division of Quantum Physics at University of Science and Technology of China
- The Chicago Quantum Exchange at University of Chicago
- The Quantum Science Group at University of Sydney
- Quantum Applications and Research Laboratory at LMU Munich
- Quantum Information & Computation at University of Innsbruck
Major Alliances in Quantum Computing
Quantum computing partnerships and alliances have become crucial for advancing the technology:
- IBM Quantum Network – 140+ participating organizations including Samsung, JPMorgan, Barclays, and Daimler
- Microsoft Quantum Network – Partners include Honeywell/Quantinuum, IonQ, Toshiba, 1QBit, and QCI
- D-Wave Partnerships – Collaborations with Multiverse, NEC, and numerous financial institutions
- NEASQC – European consortium with HSBC as first financial services member
- Financial Industry Quantum Consortiums – Informal groups of banks sharing quantum research
Potential First-Generation Applications of Quantum Computing
In advance of full-scale fault-tolerant quantum computing, there are several paths for near-term quantum applications in finance:
- Hybrid Quantum-Classical Computing: Breaking large portfolio problems into smaller subproblems that can be solved on current quantum hardware [8]
- Quantum-Inspired Algorithms: Classical algorithms redesigned using quantum principles
- Quantum Annealing: Already in production use for optimization problems [25]
A good practical definition to distinguish between classical and quantum computing is:
If a solution leverages the quantum mechanical principles of superposition and entanglement it can be called a quantum solution, or at least a hybrid classical/quantum solution. If the solution does not utilize these phenomena, we will call it a classical solution even though it may not look like a normal classical computing solution [86].
Current quantum-inspired implementations include:
- Microsoft’s quantum-inspired algorithms for optimization
- Quantum Computing Inc.’s Mukai platform for portfolio optimization
- Toshiba’s Simulated Bifurcation Machine
- Fujitsu’s Digital Annealer
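The quantum-inspired optimizers listed above differ in detail, but the family resemblance to classical simulated annealing is close: a candidate solution random-walks over the cost landscape, accepting occasional uphill moves so it can escape local minima, with the "temperature" lowered over time. A self-contained generic sketch on a toy non-convex function (illustrative only; none of these vendors' actual algorithms):

```python
import math
import random

random.seed(0)

def cost(x):
    """Toy non-convex objective with several local minima."""
    return x * x + 10 * math.sin(3 * x)

def simulated_anneal(steps=20_000, temp0=5.0):
    x = random.uniform(-10, 10)
    best_x, best_c = x, cost(x)
    for step in range(steps):
        temp = temp0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        cand = x + random.gauss(0, 0.5)            # local random move
        delta = cost(cand) - cost(x)
        # Accept downhill moves always; uphill with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
            if cost(x) < best_c:
                best_x, best_c = x, cost(x)
    return best_x, best_c

x, c = simulated_anneal()
print(round(x, 3), round(c, 3))
```

Quantum annealing replaces the thermal escape mechanism with quantum tunneling through barriers, and quantum-inspired machines like the Digital Annealer implement massively parallel variants of this loop in hardware.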
Industries that Could Be Disrupted
A BCG analysis [68] from 2018 identified future use cases of quantum computing across five major sectors, with updates based on current progress:
- Finance: Portfolio optimization, risk analysis, fraud detection, derivatives pricing
- High-tech: AI/machine learning acceleration, cryptography, search optimization
- Industrial goods: Supply chain optimization, materials discovery, autonomous systems
- Chemistry/pharma: Drug discovery, molecular simulation, catalyst design
- Energy: Grid optimization, materials for batteries, geological modeling
A McKinsey analysis [87] of 100 use cases found Finance had the most applications (28 use cases), followed by global energy/materials (16) and advanced industries (11).
The financial industry is anticipated to become one of the earliest adopters of commercially useful quantum computing technologies [9], driven by:
- Computational intensity of current problems
- High value of even small improvements
- Digital nature allowing rapid deployment
- Competitive pressures for technological advantage
How Quantum Computing Could Change Finance
There are two ways to think about the influence of quantum computing on finance. The first, and most obvious, is that the special abilities of quantum computing will enable solving certain types of problems that even the most powerful classical computers cannot do in the time needed. Where companies currently run large-scale analytics computations for risk management, forecasting, planning, and optimization, quantum computing could change future operations and even strategy.
However, there is also a second, and perhaps more important way that quantum computing can influence finance and economics over time. And that is to change the way problems are shaped, structured, and modeled. We often forget that economics in its infancy as a social science was heavily influenced by the prevailing physics of its time, which was thermodynamics. The paradigms of partial and general equilibrium in economics were borrowed from thermodynamics. Much of the ongoing dissatisfaction with neo-classical economics theory comes from the fact that the real world does not seem to behave according to equilibrium models.
David Orrell has coined the term Quantum Economics in an eponymous book, where he expounds on his idea that quantum theory holds great promise as a new and better way of modeling financial markets:
“Perhaps the most useful contribution of quantum finance will be to change the way we think about the financial system. Instead of seeing stock prices as particles that are randomly jostled from their stable resting place by interactions with many independent investors, we begin to see them as fundamentally indeterminate quantities” [88].
This is an ambitious long-term vision for changing the very foundations of current economic models. However, this paper is primarily concerned with the near-term utility of quantum computing for the finance industry.
Computational Problems in Finance That Can Be Solved With Quantum Computing
BCG estimated that quantum computing could add $40-70 billion in operating income for financial services companies after the technology has sufficiently matured [89]. The three capabilities it is projected to revolutionize are:
- Optimization: Current optimizations have to use unrealistic assumptions to simplify scenarios. Quantum computers promise to solve problems with full real-world complexity
- Simulation and pricing: Monte Carlo simulations that take days could run in real time
- Machine learning: Overcoming computational limitations for complex algorithms
IBM Quantum confirms these three capability categories with specific applications:
- Portfolio optimization and diversification (optimization)
- Option pricing and portfolio risks (simulation)
- Credit scoring and fraud detection (machine learning)
According to McKinsey, four capital markets industry archetypes will benefit: sellers, buyers, matchmakers (trading platforms), and rule setters. Quantitatively-driven hedge funds and large banks are natural early adopters.
Specific finance applications of quantum computing by vertical (according to Multiverse) [90]:
- Capital markets:
- Portfolio optimization
- Optimal investment/divestment trajectories
- Trend/anomaly detection
- Market crash predictions
- Credit and risk:
- Automated credit scoring
- Loan portfolio supervision
- Asset-liability matching optimization
- Capital allocation optimization
- Fraud detection:
- Credit card fraud
- Money transfer fraud
- Anti-money laundering
- Tax fraud detection
Trading, Portfolio and Other Optimization Applications of Quantum Computing
Quantitative investors are hoping that quantum computing will solve computational problems in portfolio optimization, arbitrage strategy, and trading cost minimization. Classical computers encounter problems with complex computing loads when adding realistic assumptions:
Adding noncontinuous, nonconvex functions such as interest rate yield curves, trading lots, buy-in thresholds, and transaction costs to investment models makes the optimization surface so complex that classical optimizers often crash, simply take too long to compute, or mistake a local optimum for the global optimum [91].
The main quantum approaches for portfolio problems are:
- Algorithms based on quantum annealing (most mature)
- Gate-based quantum algorithms (developing)
- Quantum-inspired models using tensor networks (currently available)
All approaches require translating real-world problems into polynomial unconstrained binary optimization (PUBO) expressions. Hybrid quantum-classical approaches break large portfolio problems into smaller subproblems that current quantum hardware can handle [8].
D-Wave’s quantum annealing technology is particularly suited for optimization, with production systems already processing millions of customer problems [25][26]. Quantum annealing naturally finds minimum energy states corresponding to optimal solutions.
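At toy scale, the binary-optimization formulation these approaches share can be written out and solved by exhaustive search, which also makes the exponential blow-up visible. A sketch of portfolio selection as a QUBO-style objective (the returns, covariances, and risk-aversion weight are illustrative numbers; real formulations add budget and cardinality constraints as penalty terms):

```python
from itertools import product

# Toy data: expected returns and a symmetric covariance matrix
# for 4 assets (illustrative numbers only).
returns = [0.08, 0.12, 0.10, 0.07]
cov = [
    [0.10, 0.02, 0.04, 0.01],
    [0.02, 0.12, 0.03, 0.02],
    [0.04, 0.03, 0.11, 0.01],
    [0.01, 0.02, 0.01, 0.09],
]
risk_aversion = 0.5

def objective(x):
    """Binary objective: minimize (risk-aversion * risk) - expected return."""
    ret = sum(r * xi for r, xi in zip(returns, x))
    risk = sum(cov[i][j] * x[i] * x[j]
               for i in range(4) for j in range(4))
    return risk_aversion * risk - ret

# Brute force over all 2**4 include/exclude portfolios -- exactly the
# search that grows exponentially and motivates annealing for large n.
best = min(product((0, 1), repeat=4), key=objective)
print(best, round(objective(best), 4))
```

With 4 assets there are 16 candidate portfolios; with 100 assets there are 2^100, which is why the hybrid approaches above decompose the problem or hand the search to an annealer.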
Table 4. Quantum Algorithms and Their Finance Applications
| Algorithm Type | Finance Application | Classical Complexity | Quantum Speedup | Readiness |
| --- | --- | --- | --- | --- |
| Quantum Annealing | Portfolio optimization, asset allocation | NP-hard | Polynomial for specific cases | Production [25] |
| Grover’s Algorithm | Database search, fraud detection, AML | O(N) | O(√N) | Near-term |
| VQE/QAOA | Risk analysis, stress testing | Exponential | Polynomial approximation | Development |
| Quantum ML | Credit scoring, pattern recognition | O(N²) | O(N log N) potential | Research |
| HHL Algorithm | Linear systems, pricing models | O(N³) | O(log N) theoretical | Research |
| Quantum Monte Carlo | Derivatives pricing, VaR | O(1/ε²) | O(1/ε) quadratic speedup | Development |
| Shor’s Algorithm | Cryptanalysis threat | Exponential | Polynomial | Future (requires 1000s of logical qubits) |
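To make the asymptotic columns in Table 4 concrete, here is back-of-envelope arithmetic (illustrative only, not a benchmark of any real system) for two of the rows: Grover search over a million records, and Monte Carlo pricing to 0.1% precision:

```python
import math

# Back-of-envelope scaling for two Table 4 rows (illustrative arithmetic,
# not a benchmark of any real system).

def grover_queries(n_items):
    # Classical unstructured search needs ~N/2 expected lookups;
    # Grover's algorithm needs ~(pi/4) * sqrt(N) oracle queries.
    return n_items // 2, math.ceil((math.pi / 4) * math.sqrt(n_items))

def monte_carlo_samples(eps):
    # Classical Monte Carlo error shrinks as 1/sqrt(samples): O(1/eps^2).
    # Quantum amplitude estimation reaches the same error in O(1/eps).
    return math.ceil(1 / eps ** 2), math.ceil(1 / eps)

print(grover_queries(1_000_000))   # → (500000, 786)
print(monte_carlo_samples(0.001))  # → (1000000, 1000)
```

The gap widens with scale, which is why the speedup columns matter even when constant factors on quantum hardware remain unfavorable today.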
Notable Case Studies and POCs of Quantum Computing in Finance
The landscape of quantum computing in finance has shifted dramatically from experiments to production deployments:
Quantum Computing Production and Near-Production Deployments (2024-2025)
- JPMorgan Chase – Industry Leader:
- Invested $100 million in quantum computing company Quantinuum (2024) [13]
- Published research on quantum computational supremacy for certified randomness generation with partners including Quantinuum, Argonne National Laboratory, and Oak Ridge National Laboratory (March 2025) [16]
- Developed quantum algorithms for portfolio optimization, option pricing, risk analysis, and machine learning [14]
- Established quantum-secured crypto-agile network for production environments [18]
- Pioneered hybrid quantum-classical approach with AWS and Caltech breaking large portfolio problems into subproblems [17]
- Goldman Sachs – Options Pricing Innovation:
- Collaborated with Quantum Motion to develop efficient algorithms for options pricing, demonstrating speedups that require many qubits operating simultaneously (November 2024) [15]
- Assembled full team dedicated to quantum computing focusing on simulation, optimization, and machine learning [12]
- Working with QC Ware and IonQ for derivatives pricing applications
- HSBC – Quantum Security Pioneer:
- First bank to pilot quantum cryptography for a tokenized gold trading platform [11]
- Named among top three quantum innovators in finance alongside JPMorgan and Goldman Sachs (February 2025) [19]
- Member of European NEASQC quantum consortium
- Industry-Wide Adoption:
- Wells Fargo developed nearly a dozen quantum algorithms with IBM [13]
- Mastercard working with D-Wave to optimize loyalty rewards programs [13]
- BNP Paribas invested in quantum hardware startups; Axa invested in post-quantum security [17]
- Citi partnered with Classiq for portfolio optimization exploration [17]
Earlier Financial Quantum Computing Case Studies (2019-2023) Still Relevant:
- Barclays worked with IBM on quantum algorithms for securities transaction settlement, publishing research on how small qubit systems could handle complex settlement optimization.
- BBVA is pursuing six research lines with partners including Spain’s CSIC, Accenture, Fujitsu, Zapata, and Multiverse. Early results show quantum can solve portfolio optimization “quickly, accurately, and efficiently.”
- Commonwealth Bank of Australia with Rigetti tested quantum approximate optimization algorithm (QAOA) for portfolio rebalancing, achieving results within 5% of optimal.
- NatWest Bank uses Fujitsu’s quantum-inspired Digital Annealer for £120bn portfolio optimization, achieving 300x speed improvement with higher accuracy.
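The "quantum-inspired" label in the NatWest example refers to classical heuristics borrowed from annealing. A minimal sketch of the idea behind Digital Annealer-style products, simulated annealing on a toy QUBO (matrix values invented for illustration, not drawn from any real portfolio):

```python
import math
import random

# Quantum-inspired classical baseline: simulated annealing on a QUBO.
# Q is a toy symmetric matrix (illustrative values only). Its global
# minimum energy is -6, at bitstrings (1,0,0,1) and (0,1,1,0).

Q = [[-3, 2, 2, 0],
     [2, -3, 0, 2],
     [2, 0, -3, 2],
     [0, 2, 2, -3]]

def qubo_energy(x):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(steps=5000, t_start=4.0, t_end=0.01, seed=7):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(len(Q))]
    e = qubo_energy(x)
    best_x, best_e = x[:], e
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(len(x))
        x[i] ^= 1                              # propose a single bit flip
        e_new = qubo_energy(x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                          # accept the move
            if e < best_e:
                best_x, best_e = x[:], e
        else:
            x[i] ^= 1                          # reject: undo the flip
    return best_x, best_e

x_best, e_best = anneal()
print(x_best, e_best)  # landscape minimum is -6 here
```

Hardware annealers, quantum or digital, run this same energy-minimization loop massively in parallel; the point of the sketch is only that the QUBO formulation, not the solver, is the shared interface.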
Table 5. Leading Financial Institutions in Quantum Computing (as of 2025)
| Institution | Investment | Focus Areas | Key Partners | Status |
| --- | --- | --- | --- | --- |
| JPMorgan Chase | $100M+ [13] | Portfolio optimization, cryptography, randomness generation | Quantinuum, IBM, AWS | Production |
| Goldman Sachs | Undisclosed | Options pricing, derivatives, simulation | Quantum Motion, QC Ware, IonQ | Advanced pilots |
| HSBC | Undisclosed | Quantum security, tokenized gold platform | NEASQC consortium | Early production |
| Wells Fargo | Undisclosed | Multiple quantum algorithms | IBM | Development |
| Barclays | Undisclosed | Settlement optimization | IBM | Research |
| Mastercard | Undisclosed | Loyalty program optimization | D-Wave | Pilot phase |
| BNP Paribas | Undisclosed | Hardware investments | Various startups | Investment stage |
| Citi | Undisclosed | Portfolio optimization | Classiq | Exploration |
Post-Quantum Cryptography Implementation
The quantum threat to current encryption has moved from theoretical to urgent. On August 13, 2024, NIST released three finalized post-quantum encryption standards after eight years of development [31]:
The NIST Quantum Encryption Standards:
- FIPS 203 (ML-KEM): Primary standard for general encryption based on CRYSTALS-Kyber
- FIPS 204 (ML-DSA): Primary standard for digital signatures based on CRYSTALS-Dilithium
- FIPS 205 (SLH-DSA): Hash-based digital signature standard based on SPHINCS+
NIST is encouraging computer system administrators to begin transitioning to the new standards as soon as possible [31].
Financial Sector Response to Post-Quantum Cryptography
Financial services firms’ spending on quantum-resistant cryptography is outpacing their spending on offensive quantum use cases [10] due to:
- “Harvest now, decrypt later” threats
- Regulatory compliance expectations
- Long migration timelines for critical systems
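The first of those threats is easy to illustrate. RSA’s security rests on factoring being hard: the toy below breaks a textbook-sized key by trial division, which is what Shor’s algorithm would do to real 2048-bit moduli on a large fault-tolerant machine, so ciphertext recorded today can be decrypted later:

```python
# Toy illustration of "harvest now, decrypt later". A textbook-sized RSA
# modulus is broken by trial division; Shor's algorithm would do the
# equivalent to real 2048-bit moduli on a fault-tolerant quantum computer.

def crack_rsa(n, e):
    p = next(d for d in range(2, n) if n % d == 0)  # factor n (toy sizes only)
    q = n // p
    phi = (p - 1) * (q - 1)
    return pow(e, -1, phi)                          # recover private exponent d

n, e = 3233, 17                # textbook toy key: n = 61 * 53
ciphertext = pow(65, e, n)     # intercepted ("harvested") today
d = crack_rsa(n, e)            # key broken later, once factoring is feasible
print(pow(ciphertext, d, n))   # → 65: the plaintext is recovered
```

The lesson is that the protection lifetime of encrypted data ends not when it is transmitted, but when the underlying hard problem falls.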
Early adopters of post-quantum cryptography include:
- Apple, Signal, and Zoom already implementing post-quantum encryption [32]
- Major banks beginning cryptographic inventories
- Financial infrastructure providers updating standards
Post-Quantum Cryptography Implementation Timeline
- 2024: Standards released, early adopters begin
- 2025-2026: Expected regulatory mandates
- 2027-2028: Industry-wide migration targets
- 2030: Full post-quantum security expected
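A standard way to sanity-check such a timeline is Mosca’s inequality: migrate now if the years data must stay confidential plus the years migration takes exceed the years until a cryptographically relevant quantum computer (CRQC) exists. A sketch with hypothetical numbers:

```python
# Mosca's inequality: migrate now if x + y > z, where x = years data must
# stay confidential, y = years the migration takes, z = (uncertain) years
# until a cryptographically relevant quantum computer. Numbers below are
# hypothetical.

def migration_overdue(shelf_life_years, migration_years, years_to_crqc):
    return shelf_life_years + migration_years > years_to_crqc

# Records confidential for 10 years, a 5-year migration, 12 years to a CRQC:
print(migration_overdue(10, 5, 12))  # → True: that data is already at risk
```

Because z is the one number nobody knows, institutions with long-lived data effectively cannot wait for certainty, which is the logic behind NIST’s “as soon as possible” guidance above.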
What to Expect Next for Quantum Computing
Milestones Reached and What They Portend For Quantum Computing’s Future
Recent quantum computing milestones signal accelerating progress:
2024-2025 Achievements:
- Google’s Willow processor advances in error correction (December 2024) [1]
- IBM’s 4,158-qubit modular systems demonstrated (2025) [4]
- D-Wave’s Advantage2 system with 4,400+ qubits in production (May 2025) [25]
- NIST post-quantum cryptography standards finalized (August 2024) [31]
- JPMorgan’s quantum research published (March 2025) [16]
Key Progress Indicators:
- Error rates approaching practical thresholds for specific applications
- Hybrid algorithms demonstrating business value
- Cloud accessibility democratizing quantum access
- Major financial institutions moving beyond POCs to pilots
John Preskill recently suggested moving beyond the NISQ-versus-FASQ framing toward describing machines by the reliable quantum operations they can execute: megaquop (million) and gigaquop (billion) systems [5], reflecting the field’s shift from research labels to practical capability metrics.
Plausible Timeframe for Partial vs. Full Scale Quantum Computing Solutions
Updated Industry Roadmaps:
- IBM:
- 2021: Achieved 127-qubit Eagle processor
- 2025: Demonstrated 4,158-qubit systems through modular connections [4]
- 2025: Families of pre-built runtimes for finance applications
- 2030: Million-qubit vision
- Quantinuum:
- Current: Quantum volume over two million achieved (as of 2024) [2]
- 2025-2030: Path to fault-tolerant systems
- PASQAL:
- 2024: 1,000-qubit neutral atom systems [30]
- 2026: 10,000 physical qubits targeted [29]
- 2028: 128+ logical qubits with fault tolerance [29]
- D-Wave:
- Current: 4,400+ qubit Advantage2 in production (May 2025) [25]
- Near-term: Enhanced hybrid solvers for larger problems
First and Second Wave Applications of Quantum Computing in Finance
Table 6. Projected Impact of Quantum Computing on Finance over Time (as of early 2025)
| Stage | First Wave (Now-2027) | Second Wave (2027-2032) | Third Wave (2032+) |
| --- | --- | --- | --- |
| Technology Status | NISQ era continues. 4,400+ qubit annealers operational [25]. Error mitigation reducing costs 90% [3]. Hybrid classical-quantum standard. Cloud access ubiquitous. | Logical qubit systems emerging. 10,000+ physical qubits [29]. Partial error correction. Quantum networking developing. | Full fault tolerance achieved. Million+ qubit systems. Universal quantum computers. Quantum internet operational. |
| Finance Applications | Hybrid portfolio optimization in production [8]. Options pricing pilots [15]. Post-quantum cryptography migration [31]. Quantum-inspired algorithms widespread. | Real-time portfolio rebalancing. Intraday risk calculations. Quantum ML for fraud detection. Complex derivative pricing. | Whole-market simulations. Quantum-native products. Real-time global risk modeling. Revolutionary new financial instruments. |
| Industry Impact | Major banks investing $100M+ [13]. First-mover advantages emerging. Competitive differentiation beginning. Talent wars intensifying. | Quantum capability becomes mandatory. $5-20B revenue impacts. Mid-tier bank adoption. Regulatory frameworks established. | $40-70B operating income impact. Complete industry transformation. Quantum-first strategies. New business models. |
| Adoption Level | Early adopters (10-20% of major banks). Focus on optimization. Defensive cryptography upgrades. | Mainstream adoption (50%+). Production deployments common. Industry standards mature. | Universal adoption. Quantum-native generation. Classical methods obsolete for many tasks. |
Conclusions and Recommendations
The quantum computing landscape has transformed dramatically since 2021-2022. Quantum computing is turning an important corner, with systems able to perform tasks beyond the reach of classical systems [5]. However, we remain in the NISQ era (Noisy Intermediate-Scale Quantum), with full fault-tolerant quantum computing still years away.
Key Quantum Computing Developments Since 2021
The quantum computing landscape has undergone transformative changes since 2021, marking a critical inflection point for financial institutions weighing quantum adoption strategies. The developments span hardware advances, error-correction breakthroughs, hybrid computing methods, new cryptographic security requirements, and shifting global competition, and institutions need to understand them to make informed decisions about technology investments, security protocols, and competitive positioning.
- Hardware Progress: From hundreds to thousands of qubits, with production systems processing millions of real optimization problems [25]
- Error Mitigation: 90% cost reductions in error correction [3] making larger calculations feasible
- Hybrid Approaches: Practical methods for breaking large problems into quantum-solvable pieces [8]
- Cryptographic Urgency: NIST standards released, requiring immediate action [31]
- Global Competition: U.S. and China now “nearly equal” versus 5-year U.S. lead in 2022 [21]
Strategic Quantum Computing Recommendations for Financial Institutions
Financial institutions should pursue a phased quantum strategy that balances immediate security requirements against long-term competitive advantage: migrate to post-quantum cryptography now, pilot quantum optimization applications in the near term, and move to production deployments that deliver measurable business value in the medium term. Organizations that execute such a strategy while competitors delay will secure lasting advantages in operational efficiency, risk management, and product innovation.
Immediate Actions (0-6 months):
- Begin Post-Quantum Migration: NIST standards are final – delay increases risk [31]
- Deploy Quantum-Inspired Solutions: Available today on classical hardware
- Establish Quantum Team: Talent scarcity will worsen – hire now
- Start Cloud Experiments: Low-cost access to multiple platforms
Near-Term Strategy (6-18 months):
- Pilot Hybrid Applications: Focus on portfolio optimization using quantum annealing
- Partner with Quantum Providers: Follow JPMorgan’s $100M investment model [13]
- Develop Use Case Pipeline: Identify high-value optimization problems
- Complete Cryptographic Inventory: Prepare for full PQC migration
Medium-Term Goals (18-36 months):
- Production Deployments: Move beyond pilots to operational systems
- Competitive Differentiation: Develop proprietary quantum applications
- Talent Development: Build internal quantum expertise
- Industry Leadership: Join quantum consortiums and standards bodies
Risk Assessment
Quantum computing adoption presents a complex risk-reward equation. Institutions must weigh the competitive cost of waiting against the uncertainty of investing early in an immature technology, while addressing immediate cryptographic vulnerabilities in either case. The risk-opportunity matrix in Table 7 summarizes these trade-offs, and the recommended actions, across four time horizons.
Risks of Waiting:
- Permanent talent disadvantage as scarce experts are hired elsewhere
- Competitors gaining operational experience
- Missing patent and IP opportunities
- Cryptographic vulnerabilities to “harvest now, decrypt later”
Risks of Early Investment:
- Technology immaturity and changing standards
- High initial costs with uncertain ROI
- Potential for hype-driven poor decisions
- Resource allocation challenges
Table 7. Quantum Computing Risk-Opportunity Matrix for Financial Services
| Timeline | Risks of Waiting | Risks of Acting | Opportunities | Recommended Action |
| --- | --- | --- | --- | --- |
| Immediate (0-6 months) | • Cryptographic vulnerabilities to “harvest now, decrypt later” attack • Loss of top quantum talent to competitors • Missing foundational IP development | • High initial costs ($1-5M) • Technology immaturity • Uncertain ROI | • First-mover advantage in optimization • Establish talent pipeline • Shape industry standards | • Begin PQC migration [31] • Deploy quantum-inspired solutions • Establish quantum team |
| Near-term (6-18 months) | • Competitive disadvantage as peers advance • Missed patent opportunities • Regulatory compliance gaps | • Pilot project failures • Vendor lock-in risks • Skills gap challenges | • Successful pilot deployments • Proprietary algorithm development • Strategic partnerships | • Launch hybrid quantum pilots • Partner with providers [13] • Develop use case pipeline |
| Medium-term (18-36 months) | • Permanent capability gap • Market share loss • Technology obsolescence | • Major technology pivots • Integration complexity • Scaling challenges | • Market leadership position • New product development • Operational advantages | • Scale to production • Build proprietary capabilities • Lead industry consortiums |
| Long-term (3+ years) | • Business model disruption • Inability to compete • Stranded assets | • Over-investment risk • Technology commoditization • Regulatory constraints | • Industry transformation • Quantum-native services • New revenue streams | • Full quantum integration • Develop quantum strategy • Transform business model |
Final Recommendations
The evidence strongly supports immediate engagement with quantum computing:
- Start with Low-Risk Approaches: Quantum-inspired algorithms provide immediate value
- Focus on Optimization: Proven applications in portfolio management [25]
- Prioritize Security: Post-quantum cryptography is non-negotiable [31]
- Build Gradually: From experiments to pilots to production
- Think Long-Term: Quantum advantage will emerge gradually then suddenly
Finance is anticipated to become one of the earliest adopters of commercially useful quantum computing [9]. The transition from theoretical possibility to practical reality is accelerating. Financial institutions that begin their quantum journey now will be positioned to capitalize on the technology’s transformative potential. Those that delay face growing competitive disadvantage as the quantum era unfolds.
The question is no longer whether quantum computing will impact finance, but how quickly institutions can adapt to harness its power. The time for action is now.
Glossary of Key Terms
- Algorithmic Qubits (AQ): A metric introduced by IonQ representing the number of “useful” encoded qubits in a quantum computer, calculated as log base 2 of quantum volume.
- Coherence Time: The duration a qubit can maintain its quantum state before decoherence occurs.
- Decoherence: The loss of quantum properties due to environmental interference, causing qubits to lose their superposition and entanglement.
- Entanglement: Quantum phenomenon in which qubits become correlated, so that measurement outcomes on one are correlated with outcomes on the others regardless of distance.
- Fault Tolerance Threshold: The error rate below which quantum error correction can effectively maintain quantum information indefinitely, typically around 0.1% for most error correction codes.
- Gate-based Quantum Computing: Quantum computing using quantum logic gates to manipulate qubits, analogous to classical digital logic gates.
- Hybrid Quantum-Classical Computing: Approach combining quantum and classical processors, where quantum handles specific hard subproblems while classical manages the overall computation.
- Logical Qubit: An error-corrected qubit created from multiple physical qubits to achieve fault-tolerant quantum computing.
- NISQ (Noisy Intermediate-Scale Quantum): Current era of quantum computing characterized by systems with 50-1000 qubits that have significant error rates and no error correction.
- Physical Qubit: The actual hardware implementation of a quantum bit, subject to errors and decoherence.
- Post-Quantum Cryptography (PQC): Encryption methods designed to be secure against both classical and quantum computer attacks.
- Quantum Advantage: When quantum computers provide practical business value over classical computers for real-world problems.
- Quantum Annealing: A quantum computing approach specialized for optimization problems, finding minimum energy states without using gates. Different from gate-based quantum computing.
- Quantum Supremacy: Demonstrated ability of a quantum computer to solve some problem (even a contrived one) faster than the best classical supercomputer.
- Quantum Volume (QV): IBM’s hardware-agnostic metric considering qubits, connectivity, gate fidelity, and other factors to measure quantum computer performance.
- Qubit: Quantum bit, the fundamental unit of quantum information existing in superposition of |0⟩ and |1⟩ states.
- Superposition: Quantum property allowing qubits to exist in multiple states simultaneously until measured.
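Two of the glossary entries above can be tied together numerically: under IonQ’s definition of Algorithmic Qubits, Quantinuum’s reported quantum volume of over two million [2] corresponds to roughly 21 algorithmic qubits. A one-line sketch:

```python
import math

# Algorithmic Qubits (AQ) per the glossary: log base 2 of Quantum Volume.
def algorithmic_qubits(quantum_volume):
    return int(math.log2(quantum_volume))

print(algorithmic_qubits(2 ** 21))  # QV = 2,097,152 → 21
```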
References
[1] “The Quantum Era has Already Begun,” TIME, accessed May 2025.
[2] “Quantum Computing Roadmaps: A Look at The Maps And Predictions of Major Quantum Players,” The Quantum Insider, May 16, 2025.
[3] “10 Key Quantum Computing Breakthroughs in 2025,” Trailyn.com, March 6, 2025.
[4] “IBM roadmap to quantum-centric supercomputers (Updated 2024),” IBM Quantum Computing Blog, accessed May 2025.
[5] “Quantum Computing 2025 — Is it Turning the Corner?,” HPCwire, January 1, 2025.
[6] “Google’s Quantum Computing Technology in 2024 Review,” The Quantum Insider, accessed May 2025.
[7] “IBM Launches Its Most Advanced Quantum Computers, Fueling New Scientific Value and Progress towards Quantum Advantage,” IBM Newsroom, November 13, 2024.
[8] “Quantum Computing for Portfolio Optimization and Risk Analysis: Transformative Approaches and Practical Frameworks in Financial Services,” ResearchGate, 2024.
[9] “Quantum computing’s six most important trends for 2025,” Moody’s, accessed May 2025.
[10] “Industry spending on quantum computing will rise dramatically. Will it pay off?,” Deloitte, 2023.
[11] “Preparing for a quantum future: what’s next for quantum computing in financial services?,” FinTech Futures, December 2024.
[12] “How Goldman Sachs and JPMorgan are using quantum computing,” eFinancialCareers, December 2020.
[13] “Quantum leap: JPMorgan Chase, Wells Fargo push past laggards,” American Banker, September 26, 2024.
[14] “Applied Research,” JPMorgan, accessed May 2025.
[15] “Quantum Motion And Goldman Sachs Identify Quantum Applications in Financial Services Project,” The Quantum Insider, November 4, 2024.
[16] “JPMorganChase, Quantinuum, Argonne National Laboratory, Oak Ridge National Laboratory and University of Texas at Austin advance the application of quantum computing to potential real-world use cases beyond the capabilities of classical computing,” JPMorgan press release, March 26, 2025.
[17] “Quantum Technology Use Cases in Finance & Banking,” PostQuantum, accessed May 2025.
[18] “JPMorgan Chase establishes quantum-secured crypto-agile network,” JPMorgan, accessed May 2025.
[19] “JP Morgan, HSBC and Goldman Sachs named top quantum innovators in finance,” FStech, February 2025.
[20] “China Launches $138 Billion Government-Backed Venture Fund, Includes Quantum Startups,” The Quantum Insider, March 7, 2025.
[21] “China invests billions in quantum computing, race with US now neck-and-neck,” SDxCentral, February 2024.
[22] “Quantum Initiatives Worldwide 2025,” Qureca, accessed May 2025.
[23] “China’s long view on quantum tech has the US and EU playing catch-up,” Merics, accessed May 2025.
[24] “China’s Quantum Ambitions May Face Headwinds From Weak Startup Ecosystem,” The Quantum Insider, September 14, 2024.
[25] “D-Wave Announces General Availability of Advantage2 Quantum Computer,” The Quantum Insider, May 20, 2025.
[26] “D-Wave Announces General Availability of Advantage2 Quantum Computer, Its Most Advanced and Performant System,” D-Wave press release, May 20, 2025.
[27] “D-Wave Announces 2024 Bookings and First On-Premise Advantage System Sale,” The Quantum Insider, January 10, 2025.
[28] “Unlocking the Potential of Quantum Computing: the Neutral Atoms perspective,” Inside Quantum Technology, accessed May 2025.
[29] “PASQAL Issues Roadmap to 10,000 Qubits in 2026 and Fault Tolerance in 2028,” HPCwire, March 20, 2024.
[30] “Pasqal raises €100 Million Series B funding to advance Neutral Atoms Quantum Computing,” Pasqal press release, February 11, 2025.
[31] “NIST Releases First 3 Finalized Post-Quantum Encryption Standards,” NIST, August 13, 2024.
[32] “NIST’s Post-Quantum Cybersecurity Standards Ready for Enterprise Use,” PYMNTS.com, 2024.
[33] All processing is done via logic gates (AND, OR, NOT, XOR i.e. exclusive OR), which compare two or more bits at a time.
[34] The Antikythera mechanism, an ancient Greek hand-powered orrery, dated as far back as 100-200 BC is considered to be the earliest analog computer.
[35] However, the physics of modern semiconductors intrinsically rely on quantum effects, and it is the understanding of those quantum effects that enabled the inventors of the transistor to create the device in the first place.
[36] Einstein famously called this phenomenon “spooky action at a distance.”
[37] For more details, see Table 1 in the subsequent section.
[38] Dietz, Miklos, Nico Henke, Jared Moon, Jens Backes, Lorenzo Pautasso, and Zaheen Sadeque, “How quantum computing could change financial services,” McKinsey & Company, Dec. 2020.
[39] Complex algebra uses an expanded number system where there are both real and imaginary numbers. A complex number is a number of the form a + bi, where a and b are real numbers, and i is the imaginary unit number defined as √-1.
[40] A vector that points to a specific point in space which corresponds to a particular quantum state.
[41] Levy, Max G., “New Quantum Algorithms Finally Crack Nonlinear Equations,” Quanta Magazine, Jan. 5, 2021.
[42] Deutsch, David. “Quantum theory, the Church–Turing principle and the universal quantum computer.” Proceedings of the Royal Society of London. A. Mathematical and Physical Sciences 400, no. 1818, pp. 97-117, 1985.
[43] Shor, Peter W, “Algorithms for quantum computation: discrete logarithms and factoring.” In Proceedings 35th annual symposium on foundations of computer science, pp. 124-134. IEEE, 1994.
[44] For a more complete discussion of the mathematics, see “Shor’s algorithm,” IBM Quantum, https://quantum-computing.ibm.com/docs/iqx/guide/shors-algorithm.
[45] In computer science, the order of sophistication and problem difficulty are as follows: P (easy), NP (medium hard), NP-complete (hard), and NP-hard (hardest). The P = NP problem is currently the subject of a $1m Millennium Prize for a correct solution.
[46] Grover, L.K., “A fast quantum mechanical algorithm for database search,” Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing – STOC ’96, 1996, pp. 212–219.
[47] Y. Li, M. Tian, G. Liu, C. Peng and L. Jiao, “Quantum Optimization and Quantum Learning: A Survey,” in IEEE Access, vol. 8, pp. 23568-23593, 2020.
[48] https://quantumalgorithmzoo.org/
[49] The term “annealing” in computer science is borrowed from metallurgy, where it refers to thermal annealing, i.e. the heat treatment that increases metal ductility while reducing hardness.
[50] See for example, “What is Quantum Annealing?,” D-Wave System Documentation, https://docs.dwavesys.com/docs/latest/c_gs_2.html.
[51] The Traveling Salesman Problem (TSP) is an NP-hard problem in combinatorial optimization.
[52] McMahon, Peter, “To Crack the Toughest Optimization Problems, Just Add Lasers,” IEEE Spectrum, Nov. 27, 2018.
[53] Burkacky, Ondrej, Niko Mohr, and Lorenzo Pautasso, “Will quantum computing drive the automotive future?,” McKinsey & Company, Sep. 2, 2020.
[54] Pakin, Scott, “The Problem with Quantum Computers,” Scientific American, Jun. 10, 2019.
[55] Dyakonov, Mikhail, “The Case Against Quantum Computing,” IEEE Spectrum, Nov. 15, 2018.
[56] See, for example, the Comments section below the original article at https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing.
[57] Versluis, Richard, “Here’s a Blueprint for a Practical Quantum Computer,” IEEE Spectrum, Mar. 24, 2020.
[58] The QC equivalent of an SDK.
[59] Ibaraki, Stephen, “What You Need for Your Quantum Computing Pilots In 2021,” Forbes, Jan. 29, 2021.
[60] https://www.fujitsu.com/us/services/business-services/digital-annealer/services/
[61] Wolchover, Natalie, “Classical computing embraces quantum ideas,” Quanta Magazine, Dec. 18, 2012.
[62] Martinis, John, “Quantum Supremacy Using a Programmable Superconducting Processor,” Google AI Blog, Oct. 23, 2019.
[63] Arute, Frank, Kunal Arya, Ryan Babbush, Dave Bacon, Joseph C. Bardin, Rami Barends, Rupak Biswas et al. “Quantum supremacy using a programmable superconducting processor.” Nature 574, no. 7779 (2019): 505-510.
[64] Preskill, John. “Quantum computing and the entanglement frontier.” arXiv preprint arXiv:1203.5813 (2012).
[65] Haughton, Richard, “Race for quantum supremacy hits theoretical quagmire,” Nature, Nov. 14, 2017.
[66] See “TQD Exclusive: A Detailed Review of Qubit Implementations for Quantum Computing,” The Quantum Daily, May 21, 2020.
[67] Ions are electrically charged atoms, meaning they have more or fewer electrons than protons.
[68] Ruess, Frankl, “The Next Decade in Quantum Computing—and How to Play,” BCG, Nov. 15, 2018.
[69] https://www.dwavesys.com/
[70] Customers include Lockheed Martin, Google, NASA, the University of Southern California, and Los Alamos National Laboratory.
[71] https://research.google/research-areas/quantum-computing/
[72] https://www.honeywell.com/us/en/company/quantum
[73] https://quantum-computing.ibm.com/
[74] Crosman, Penny, “JPMorgan Chase, Barclays join IBM quantum computing network,” American Banker Vol. 182 Issue 240, Dec. 15, 2017.
[75] https://ionq.com/
[76] https://1qbit.com/
[77] https://aws.amazon.com/braket/
[78] https://cambridgequantum.com/
[79] https://azure.microsoft.com/en-us/solutions/quantum-computing/
[80] https://www.multiversecomputing.com/
[81] https://qcware.com/
[82] https://quantumcomputinginc.com
[83] https://jila.colorado.edu/
[84] https://jqi.umd.edu/
[85] Matt Swayne, “The World’s Top 12 Quantum Computing Research Universities,” The Quantum Daily, Nov. 28, 2019.
[86] “What is This Quantum-Inspired Stuff All About?,” Quantum Computing Report.
[87] Ménard, Alexandre, Ivan Ostojic, Mark Patel, and Daniel Volz, “A game plan for quantum computing,” McKinsey & Company, Feb. 6, 2020.
[88] Orrell, David. Quantum Economics (p. 172). Icon Books Ltd. Kindle Edition.
[89] Bobier, Jean-François, Jean-Michel Binefa, Matt Langione, and Amit Kumar, “It’s Time for Financial Institutions to Place Their Quantum Bets,” BCG, Oct. 16, 2020.
[90] Mugel, Samuel, Quantum Computing for Finance, Jan 29, 2021.
[91] Langione, Matt, Corban Tillemann-Dick, Amit Kumar, and Vikas Taneja, “Where Will Quantum Computers Create Value—and When?,” BCG, May 13, 2019.
[92] Note: Unless otherwise specified, all qubit counts in this paper refer to physical qubits rather than logical qubits.

Joseph Byrum is an accomplished executive leader, innovator, and cross-domain strategist with a proven track record of success across multiple industries.
