In June 2019, Clabby Analytics published a blog titled “A New Inflection Point in Quantum Computing: IBM’s Evolving Quantum Development Environment,” found here. In that blog, we argued that, with the evolution of an open source quantum computing development environment, quantum computing has moved beyond the experimental stage into a new, commercially viable stage.
We based our opinion on beta code that provided visual insights into coding for a quantum computer; a highly integrated, cloud-based Qiskit development environment; Jupyter Notebook facilities that maintain system configurations and help track progress; and graphical tools that provide deep insights into system behavior and circuit design. All of this design and integration work left us with the distinct impression that IBM – and, indeed, the quantum computing industry – had turned a corner. At this juncture, we became quantum computing believers.
Other research and analysis firms, however, are not as optimistic as we are about quantum computing. They see it as a long-range endeavor – still highly experimental – with few practical, real-world use cases in place. We think these firms are missing the point: quantum computing is growing like wildfire. More than 165,000 registered users have signed up to experiment with IBM’s cloud-based Q Experience, and over thirty million quantum calculations have been performed to date.
It should also be noted that an entire ecosystem has grown up around IBM’s quantum computing foray. Large corporations – including JP Morgan Chase & Company, ExxonMobil, Samsung, Daimler, JSR Corporation, Accenture, and the US Air Force Research Lab – have become industry partners with IBM, helping drive the development of quantum computing solutions. Research hubs include Keio University, Oak Ridge National Laboratory, NC State University, The University of Melbourne, the University of Oxford, the University of the Bundeswehr Munich, National Taiwan University, the Iberian Nanotechnology Laboratory, and CSIC Spain. Member organizations include Barclays, Mizuho, Argonne Lab, Berkeley Lab, Honda, Hitachi Metals, and more. And nineteen startups have become part of the effort to create quantum computing solutions, including QC Ware, Grid, Zapata, Q-CTRL, Quantum Benchmark, Netramark, Entropica Labs, Boxcat, and Rahko. Furthermore, IBM has signed agreements with twenty-six academic partners – including MIT, Virginia Tech, Notre Dame, Harvard, Duke, Northwestern, NYU, the University of Turku, the University of Innsbruck, ETH Zurich, and Saarland University – all helping drive quantum skill creation around the world. With all this activity, the momentum behind quantum computing appears to be strong.
Computing Power
In a recent briefing, IBM provided Clabby Analytics with a detailed roadmap, by industry, of quantum computing use cases and estimated timeframes. Our most important finding regarding these use cases is that the areas that quantum computing can serve in the chemicals and petroleum, distribution and logistics, financial services, healthcare and life sciences, and manufacturing industries are well known – but the roll-out of quantum solutions is being throttled by quantum computing capacity. In other words, the types of problems that quantum computing can solve are already well known – but much more computing capacity (along with better noise reduction, fault tolerance, and better qubits) is needed to tackle larger problems. In short, quantum computing at present is “hardware constrained.”
As more capacity is added to IBM Q systems, more quantum solutions can come online. Note: IBM Q quantum computers are currently doubling in compute capacity every year.
At present, IBM’s largest quantum computer has 53 qubits – enough computing power to run hundreds of different algorithms, but not enough for some very advanced use cases that involve much larger problems. So, it could take a decade or more for some of the more advanced quantum use cases to come to fruition.
At 53 qubits, IBM’s quantum computers are still too small to tackle these problems at scale, but they are progressing along a path toward solving problems in the chemicals and petroleum industries related to surfactants, catalysts, and chemical product design. In the distribution and logistics industries, quantum computing can potentially help solve problems related to vehicle routing and network optimization. In financial services, quantum computers may eventually help solve problems related to credit/asset scoring, transaction settlement, and portfolio management. In healthcare and life sciences, quantum computers may aid in drug discovery, in licensing, and in protein structure prediction. In manufacturing, quantum systems might help solve problems related to quantum chemistry and materials discovery.
But again: much more computing power (Quantum Volume) is needed to tackle problems of any great scale. “Quantum Volume” is a measure of the overall power of a circuit-based quantum computing system. It involves not just the number of qubits but also many other factors – such as gate error rates, connectivity, and compiler efficiency – that influence just how “good” those qubits are. (For an overview of quantum computers, see this blog: https://ibm.co/2EAJEbx.)
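For readers who want to see what the benchmark actually runs, Qiskit ships a model circuit for it. Below is a minimal sketch that builds and samples one “square” (equal width and depth) Quantum Volume circuit; it assumes a recent Qiskit installation with the qiskit-aer simulator package, and the qubit count chosen here is arbitrary:

```python
# Minimal sketch: build and sample one Quantum Volume model circuit.
# Assumes a recent Qiskit with the qiskit-aer simulator package installed.
# The real benchmark runs many such "square" random circuits (depth equal
# to width); a device "passes" width n when its measured heavy-output
# probability exceeds two-thirds, yielding a Quantum Volume of 2^n.
from qiskit import transpile
from qiskit.circuit.library import QuantumVolume
from qiskit_aer import AerSimulator

num_qubits = 4                                    # width = depth = 4
qv_circuit = QuantumVolume(num_qubits, depth=num_qubits, seed=42)
qv_circuit.measure_all()

backend = AerSimulator()
job = backend.run(transpile(qv_circuit, backend), shots=1024)
print(job.result().get_counts())                  # sampled bitstring distribution
```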
As Quantum Volume increases, expect advances in chemical simulation, scenario simulation, optimization, and artificial intelligence/machine learning (AI/ML) to start to occur. After this, the next phase of use cases in chemicals and petroleum will include advances in oil shipping/trucking, refining processes, feedstock-to-product conversion, and drilling locations. Advances in distribution and logistics will include freight forecasting, irregular behaviors (Ops), disruption management, and distribution supply chain improvements. In financial services, quantum computers will bring advances in derivatives pricing, irregular behavior analysis (fraud analysis), and investment risk analysis. In healthcare and life sciences, advances can be expected in accelerated diagnosis, clinical trial enhancements, genomic analysis, and medical/drug supply chain activities. Finally, in manufacturing, quantum computing will aid in quality control, process planning, manufacturing supply chain activities, and fabrication optimization.
A third phase – driven by increases in system power and the introduction of fault tolerance, when thousands of qubits can be used reliably – may bring advances in seismic imaging, consumer offer recommendations, financial offer recommendations, disease risk prediction, and structural design and fluid dynamics.
Proof-Point Algorithms
The early experimentation in quantum computing has helped create four “algorithm families,” groups of algorithms that showcase quantum computing strengths. So far, the strengths of quantum computing have been in the areas of:
• Chemical simulation (simulating physical processes that can be found in nature). Quantum computers can help in the process of molecular design – enabling researchers and engineers to make chemicals or materials for given purposes.
• Scenario simulation (measuring a range of different outcomes). This quantum function can help in risk management by measuring the volatility of outcomes, in pricing by aiding in the evaluation of asset values for trades, and in marketing by monitoring impacts on economic systems.
• Optimization (seeking to find optimal paths). This function is useful in routing, in supply chain management, in portfolio management, and in operations – helping to improve productivity while making better use of resources. Note, however, that quantum computing is not expected to efficiently solve the famous “Traveling Salesman Problem.”
• AI/ML (useful for finding relations in data and building assumptions). Using this family of algorithms may help improve the prediction of future events, classification (category handling), and pattern analysis (such as finding anomalies in data).
Currently, some of the more important algorithms that can be run on IBM Q systems include:
• VQE (Variational Quantum Eigensolver) – an algorithm that optimizes compute-intensive functions. This algorithm calculates complex portions of simulations more efficiently than traditional computing environments. For small data sets, this is an excellent proof point of the optimization characteristics of quantum computing. But, again, the required qubit count increases significantly with problem size – so more powerful systems are needed to better exploit this algorithm. (A minimal sketch of the VQE approach appears after this list.)
• QAOA (Quantum Approximate Optimization Algorithm) – an algorithm that helps optimize combinatorial-style problems to find solutions with complex constraints. This algorithm simplifies the analysis of constraint clauses – leading to significant streamlining and optimization advantages over traditional computing environments.
• QAE (Quantum Amplitude Estimation) – helps create simulation scenarios by estimating unknown properties. This algorithm handles random distributions directly, instead of only sampling data. By offering a quadratic speedup over classical sampling, QAE can dramatically speed up simulation performance.
• QSVM (Quantum Support Vector Machine) – helps supervised machine learning perform analysis on high-dimensional problem sets – meaning it maps data into large feature spaces to enable separation. By doing this on a quantum computer, machine learning applications can better separate the data to be classified and achieve greater accuracy.
• HHL (the Harrow-Hassidim-Lloyd algorithm) – helps estimate properties of the solutions to large systems of linear equations, enabling quantum computers to handle high-dimensional problems better than traditional systems. Under the right conditions, this algorithm should lead to an exponential speedup in some matrix calculations.
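To make the variational idea behind VQE (and, by extension, QAOA) concrete, here is a minimal sketch of the quantum-classical loop: a parameterized circuit prepares a trial state, a simulator estimates the energy of a toy Hamiltonian, and a classical optimizer adjusts the circuit’s angles. This is an illustration under assumptions – a Qiskit version that ships the reference Estimator primitive, SciPy installed, and a made-up two-qubit Hamiltonian – not IBM’s production implementation:

```python
# Minimal VQE-style loop (illustrative sketch, not IBM's implementation).
# Assumes a Qiskit version with the reference Estimator primitive and SciPy.
import numpy as np
from scipy.optimize import minimize
from qiskit.circuit.library import TwoLocal
from qiskit.primitives import Estimator
from qiskit.quantum_info import SparsePauliOp

# Toy two-qubit Hamiltonian; a real chemistry problem would map a
# molecular Hamiltonian onto Pauli terms like these.
hamiltonian = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5), ("IX", 0.5)])

# Parameterized trial circuit (the "ansatz") whose angles we optimize.
ansatz = TwoLocal(2, rotation_blocks="ry", entanglement_blocks="cx", reps=2)
estimator = Estimator()

def energy(params):
    """Cost function: the expectation value <psi(params)|H|psi(params)>."""
    return estimator.run(ansatz, hamiltonian, params).result().values[0]

initial = np.random.uniform(0, 2 * np.pi, ansatz.num_parameters)
result = minimize(energy, initial, method="COBYLA")
print("Estimated ground-state energy:", result.fun)
```

QAOA follows the same feedback loop, with an ansatz that encodes the combinatorial problem’s constraints. As for QAE’s quadratic claim, the intuition is that classical Monte Carlo error shrinks as 1/√N with N samples, while amplitude estimation error shrinks roughly as 1/N with N quantum queries.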
Summary Observations
As we stated in our original blog (here), IBM’s quantum computing goal is “to reach ‘Quantum Advantage,’ a state where specific business and science problems are solved using quantum system circuits that cannot be replicated using classical computing systems, thus delivering a unique advantage.”
To reach this goal and achieve a significant improvement over classical systems, the power of quantum computers, as measured by Quantum Volume, must continue to double every year. Quantum Volume is the fundamental performance metric that measures progress in the pursuit of Quantum Advantage – the point at which quantum applications deliver a significant practical benefit beyond what classical computers alone are capable of. Right now, Quantum Volume is the limiting factor in quantum computing – not a lack of use cases.
At present, quantum algorithms are showing promising results in tests to speed up calculations, optimize processes, and increase accuracy. The initial set of problems that quantum computing may tackle is seemingly well known. Use cases for quantum computing, and a roadmap, are evolving quickly. The ecosystem is gearing up to exploit quantum computing as systems increase in power.
To say that quantum computing is decades off – too far in the future to even think about – would be a mistake. Important large firms are already experimenting with quantum computing; business partners are in place to work on industry quantum solutions; governments and universities are involved in quantum experimentation – and schools are already teaching quantum computing fundamentals. Quantum computing is real – and it is becoming commercially viable.
From where we stand, there is considerable momentum behind quantum computing – and that momentum (which combines the thoughts and efforts of thousands of individuals, not just one company) leads us to believe that important Quantum Advantage solutions, rolled out as hybrid classical/quantum solutions, may arrive a lot sooner than most other research analysts predict.