Quantum Optimization: The Future of Operations Research


Published on INFORMS Analytics Magazine (Joseph Byrum)

The critical tools analysts rely upon to optimize business processes tend to be general-purpose items. The calculator, personal computer, spreadsheet and software packages like R are indispensable, but they were never designed with operations research (O.R., i.e., predictive and prescriptive analytics) in mind. That could change with the next great leap in optimization technology: quantum computing.

The traveling salesman problem is the classic puzzle that O.R. is designed to solve. Given a list of destinations, find the shortest possible route that hits each location once and returns to the origin. The permutations of possible itineraries grow with the number of cities, and once you get over about 40 or 50 cities, even the planet’s fastest supercomputer can’t brute force the answer within a human lifespan. So, the job of O.R. is not to come up with the absolute best theoretical result, but to come up with analytical shortcuts that produce the practical result of a travel time measurably reduced compared to a baseline. Quantum computing, on the other hand, promises to deliver the ability to solve a large traveling salesman problem in a fraction of a second.
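To see why the search space explodes, here is a minimal Python sketch (with made-up city coordinates) comparing an exact brute-force tour against the nearest-neighbor heuristic, the kind of analytical shortcut O.R. relies on:

```python
from itertools import permutations
import math

# Hypothetical city coordinates, for illustration only.
CITIES = {"A": (0, 0), "B": (1, 5), "C": (5, 2), "D": (6, 6), "E": (8, 0)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def tour_length(order):
    # Total length of a round trip visiting cities in the given order.
    pts = [CITIES[c] for c in order]
    return sum(dist(pts[i], pts[(i + 1) % len(pts)]) for i in range(len(pts)))

def brute_force(start="A"):
    # Exact answer: try all (n-1)! orderings of the remaining cities.
    rest = [c for c in CITIES if c != start]
    return min(((start,) + p for p in permutations(rest)), key=tour_length)

def nearest_neighbor(start="A"):
    # Heuristic shortcut: always hop to the closest unvisited city.
    tour, unvisited = [start], set(CITIES) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist(CITIES[tour[-1]], CITIES[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tuple(tour)

best = brute_force()
greedy = nearest_neighbor()
print(tour_length(best), tour_length(greedy))
```

Brute force checks (n-1)! orderings, so it becomes infeasible within a few dozen cities; the heuristic scales to large instances but may return a somewhat longer tour.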

Quantum Computing Basics

It seems too good to be true, but it’s all made possible by the bizarre interactions of particles at the subatomic level. Nothing about this is simple or easy. Richard Feynman, the theoretical physicist who landed a Nobel Prize for his work in quantum mechanics [1], once told his Cornell students that nobody actually understands what’s going on with the smallest objects in the universe [2]. But that doesn’t mean you can’t measure particle behavior and harness those movements to produce useful results, just as we harness the passage of electrical current through logic gates in a conventional computer chip.

Quantum particles have two signature properties – superposition and entanglement – that are being exploited to create a new way of thinking about data. Conventional computers assign values of 0 or 1 to a bit, while the basic unit of a quantum computer is the “qubit,” which can exist in multiple states (superposition) during processing. Thanks to entanglement, the qubits also interact with one another before coming to rest in a binary state when measurement takes place.
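Both properties can be illustrated with a small NumPy state-vector calculation (a classical sketch, not real quantum hardware): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles two qubits into a Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Single-qubit |0> state and the Hadamard gate that creates superposition.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

plus = H @ ket0            # equal superposition of |0> and |1>
probs = np.abs(plus) ** 2  # measurement probabilities: 50/50

# Two qubits: Hadamard on the first, then CNOT entangles them.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ np.kron(plus, ket0)   # Bell state (|00> + |11>) / sqrt(2)
bell_probs = np.abs(state) ** 2      # only 00 and 11 are ever observed
print(np.round(probs, 3), np.round(bell_probs, 3))
```

Measuring the entangled pair never yields 01 or 10: the two qubits settle into matching binary states, which is the correlation the article describes.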

The advantage qubits have over conventional bits is exponential scaling. Ten qubits utilizing entanglement could be in a superposition of 1,024 (= 2^10) basis states. That means when more processing power is needed, just add more qubits: each additional qubit doubles the size of the state space, so n qubits span 2^n basis states. Once you hit about 300 qubits, you’re looking at more basis states than there are atoms in the observable universe – quite a useful property when you want to process the near-infinite combinations of variables that O.R. deals with on a regular basis.
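The arithmetic behind that claim is simple enough to check directly:

```python
# Each added qubit doubles the number of basis states the register can span.
def state_count(n_qubits):
    return 2 ** n_qubits

print(state_count(10))              # 1024
print(state_count(300) > 10 ** 80)  # True: exceeds the ~10^80 atoms in the observable universe
```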

A 300-qubit machine doesn’t exist – yet. That’s because subatomic particles are fantastically difficult to harness. They must be shielded from interference of every kind, including magnetic fields, radio waves, vibration and heat. Just looking at a subatomic particle influences its behavior, and thus the outcome of processing. So qubits can’t simply be kept cool with a small fan or water-cooling loop like an ordinary PC; they must be so isolated from heat sources that their operating temperature, 0.15 K, sits just above absolute zero – colder than the depths of interstellar space.

IBM, D-Wave and others have succeeded in overcoming these hurdles to develop primitive quantum hardware that harnesses the power of a handful of qubits. These machines have established that the technology is real, and the results are indeed coming from the interactions of subatomic particles. Quantum computers come in a number of forms, but the most useful to O.R. and analytics professionals is known as a “quantum annealer.” Operators of these systems convert real-world problems into a quadratic unconstrained binary optimization (QUBO) expression of the sort that should be familiar to anyone who has tackled the traveling salesman problem.
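As a toy illustration of what such a formulation looks like, the constraint “choose exactly one of two binary variables” can be encoded as the penalty (x0 + x1 - 1)^2, which, using x^2 = x for binary variables, reduces to the small Q matrix below (plus a constant offset that doesn’t change which state is lowest). This is a hand-worked example, not code from any vendor’s toolkit:

```python
import itertools
import numpy as np

# QUBO for the penalty (x0 + x1 - 1)^2, dropping the constant offset of 1:
# diagonal entries are the linear terms, the off-diagonal entry the coupling.
Q = np.array([[-1.0, 2.0],
              [0.0, -1.0]])

def energy(x, Q):
    # QUBO energy: x^T Q x for a binary vector x.
    x = np.array(x)
    return x @ Q @ x

# Exhaustively evaluate all assignments; an annealer explores this same
# energy landscape physically instead of by enumeration.
landscape = {x: energy(x, Q) for x in itertools.product([0, 1], repeat=2)}
best = min(landscape, key=landscape.get)
print(landscape, best)
```

The minimum-energy states are (1, 0) and (0, 1), exactly the assignments that satisfy the constraint; larger problems like the traveling salesman are encoded the same way, with penalties for each violated constraint.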

The quantum computer runs a program that interacts with the quantum computing chip to create an “energy landscape” with the equation’s variables. The subatomic particles interact with one another, existing in superposition and entanglement, while seeking to come to a rest at their minimum state. At the end of the annealing process, the result should be the lowest- or near-lowest-energy solution to the problem. The amazing part seems to be that, no matter how complicated the problem, run times are measured in milliseconds. This could open up a new frontier of real-time optimization of highly complex problems [3].
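That settling-to-minimum-energy process can be mimicked in software with classical simulated annealing – a rough analogue of the hardware, not the quantum dynamics themselves. A minimal sketch, assuming the QUBO is given as a dictionary of {(i, j): weight} entries:

```python
import math
import random

def anneal(Q, n_vars, steps=5000, t0=2.0, seed=0):
    """Classical simulated annealing over a QUBO energy landscape."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_vars)]

    def energy(x):
        return sum(w * x[i] * x[j] for (i, j), w in Q.items())

    e = energy(x)
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9   # cool linearly toward zero
        i = rng.randrange(n_vars)
        x[i] ^= 1                               # propose flipping one bit
        e_new = energy(x)
        # Downhill moves are always kept; uphill moves survive with a
        # Boltzmann probability that shrinks as the system cools.
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temp):
            e = e_new
        else:
            x[i] ^= 1                           # reject: undo the flip
    return x, e

# Toy landscape: the "pick exactly one bit" penalty, minimum energy -1.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}
solution, e = anneal(Q, n_vars=2)
print(solution, e)
```

A quantum annealer performs the analogous search through physics rather than iteration, which is why its run times stay in the millisecond range regardless of how rugged the landscape is.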

The tools needed to conduct such optimization experiments are available right now for free or at a very low cost. The only catch is that quantum systems are crude devices with limited processing power – for now. IBM offers public cloud-based access to both real quantum machines and simulators through its “Quantum Experience” [4]. Both Amazon and D-Wave offer limited access to their quantum machines for free, allowing anyone to test code developed using open-source Python libraries or freely available quantum computing frameworks like Qiskit.

There are also quantum computing simulators that allow for testing and refinement of code for optimization solutions that may be beyond the capabilities of early quantum hardware – code that will be ready to run when the number of available qubits grows. Given the technical challenges involved in increasing quantum hardware power, that could take a while. There’s no way to know whether developments will take years or more than a decade, but this is where the industry is headed. And that’s good for O.R.

O.R.’s Advantage in the Quantum Future

While it might seem at first that powerful quantum computers are going to solve every difficult problem and send O.R. practitioners to the unemployment line, the opposite is actually the case. The hardest part of coming up with a quantum optimization solution isn’t coding for a quantum computer, it’s framing the challenge in a way the machine can understand. That’s exactly the area in which operations research professionals excel. Expect demand to grow for talent with experience in framing optimization problems in the years ahead. Those interested in the potential of quantum optimization ought to begin experimenting now.

References

  1. https://www.caltech.edu/about/news/feynmans-nobel-year-48524
  2. https://youtu.be/Ja0HSFj8Imc?t=483
  3. See, for example, “What is Quantum Annealing?” D-Wave System Documentation, https://docs.dwavesys.com/docs/latest/c_gs_2.html
  4. https://quantum-computing.ibm.com/