The Future of Quantum in the Cloud
Five years ago, IBM became the first company to make a quantum computer available over the cloud. The approach made sense instantly: these are rare, expensive, and incredibly temperamental machines, so it would be unwise for potential users to invest in owning one. IBM has stuck with the approach – today, anyone can access about half of IBM’s 20 quantum machines for free on its expanded cloud environment. But not every system manufacturer wants to build a cloud, and not every potential quantum customer wants to access a different cloud for each task.
Microsoft and Amazon kicked off a new movement by creating cloud environments that let users access quantum computers from multiple hardware providers. In the case of Microsoft Azure Quantum, users can run code on real quantum systems from Honeywell, IonQ, and Quantum Circuits, Inc. What are the benefits of this approach? What can we use these machines for in the near future and beyond?
Cloud quantum development
These cloud services are much more than portals to advanced quantum machines. Microsoft Azure Quantum and Amazon Braket are complete ecosystems with reusable code and APIs that let quantum programs interface with classical applications. The cloud giants can concentrate on providing these rich development platforms and secure, easy access. In turn, companies like Honeywell and IonQ can focus on what they do best: building and improving their quantum computers.
Classical computers are not going away, and quantum computers will not exist in a vacuum. Expect classical and quantum computing to coexist for the foreseeable future as a hybrid pair, with classical systems handing tasks to quantum ones whenever the latter can provide a measurable edge in performance or cost. It therefore makes sense for users to access quantum computers in the same cloud environment where they run their classical workloads. This way, developers can create applications in which classical and quantum hardware interact seamlessly on the back end.
Such a usage approach is not that different from today’s tiered cloud infrastructure. For example, systems with expensive GPUs are only used in the cloud when they can assist with tasks such as machine learning.
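The hybrid dispatch pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the function names (`estimated_speedup`, `solve_on_quantum`) and the size-based heuristic are hypothetical stand-ins for a real cost model and a real cloud submission.

```python
# A minimal sketch of the hybrid pattern: a classical driver routes a
# task to a quantum backend only when the expected edge justifies it.
# All names and thresholds here are illustrative, not a real API.

def estimated_speedup(task_size: int) -> float:
    """Toy heuristic: assume the quantum edge grows with problem size."""
    return task_size / 100.0

def solve_classically(task_size: int) -> str:
    return f"classical result for size {task_size}"

def solve_on_quantum(task_size: int) -> str:
    # In practice this would submit a circuit to a cloud service
    # such as Azure Quantum or Braket and wait in a queue.
    return f"quantum result for size {task_size}"

def dispatch(task_size: int, threshold: float = 1.0) -> str:
    """Send work to quantum hardware only when the edge is measurable."""
    if estimated_speedup(task_size) > threshold:
        return solve_on_quantum(task_size)
    return solve_classically(task_size)
```

The same gating logic is how tiered cloud systems already decide when a GPU instance is worth spinning up.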
Using all this power today – and considering future options for growth and real-world applications
What are companies using these cloud systems for? Current quantum computers lack enough quantum bits, or qubits, to rival classical computers just yet. However, one approach that promises potential advantage in the near term is optimization. Experiments in portfolio optimization have caught the attention of leading financial services firms, which are already making significant investments in this space. In such a hybrid workflow, classical computers hand tasks to a type of quantum computer called an annealer, which identifies profitable trades; the classical system then pieces the results together.
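Annealers take problems phrased as a quadratic objective over binary choices. As a hedged sketch of what the classical side hands off, here is a toy portfolio selection: pick a subset of assets to maximize expected return minus a risk penalty. The numbers are made up for illustration, and an exhaustive search stands in for the annealer, which becomes necessary precisely when the asset count makes exhaustive search impossible.

```python
from itertools import product

# Toy portfolio selection in the annealer's native form: maximize
# return minus a risk penalty over 0/1 asset choices. All figures
# are illustrative; brute force stands in for the quantum annealer.
returns = [0.12, 0.09, 0.15, 0.07]           # expected return per asset
risk = [[0.10, 0.02, 0.04, 0.01],            # covariance-like risk matrix
        [0.02, 0.08, 0.03, 0.02],
        [0.04, 0.03, 0.12, 0.01],
        [0.01, 0.02, 0.01, 0.05]]
risk_aversion = 0.5

def objective(x):
    """Expected return minus penalized risk for a 0/1 selection x."""
    gain = sum(r * xi for r, xi in zip(returns, x))
    penalty = sum(risk[i][j] * x[i] * x[j]
                  for i in range(len(x)) for j in range(len(x)))
    return gain - risk_aversion * penalty

best = max(product([0, 1], repeat=len(returns)), key=objective)
print(best, round(objective(best), 4))
```

With 4 assets there are only 16 candidate portfolios; with 100 assets there are 2^100, which is where the hardware is meant to earn its keep.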
Beyond financial services, other optimization problems this hybrid approach can tackle include finding the best shipping routes, such as for delivery trucks.
Anywhere we use machine learning today, we can expect some quantum edge in the future. Financial services firms are also interested in improved fraud detection and credit scoring, for example. Another major use case is really two different types of simulation. Monte Carlo risk simulations show promise for a quadratic speedup on quantum hardware. Then there are simulations of the subatomic world itself, such as modeling molecules for materials science.
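The quadratic speedup claimed for Monte Carlo is easy to make concrete. A classical estimator's error shrinks like 1/sqrt(N) with N samples, while quantum amplitude estimation's error shrinks like 1/N with N circuit evaluations, so hitting an error of 10^-d takes roughly 10^2d classical samples but only about 10^d quantum ones. A small sketch of that arithmetic (the scaling laws are standard; the functions are illustrative):

```python
# Quadratic speedup for Monte Carlo estimation, as sample counts.
# Classical error scales like 1/sqrt(N); quantum amplitude
# estimation's error scales like 1/N (constants omitted).

def classical_samples(digits: int) -> int:
    """Samples for ~10**-digits error at 1/sqrt(N) scaling."""
    return 10 ** (2 * digits)

def quantum_samples(digits: int) -> int:
    """Circuit runs for the same error at 1/N scaling."""
    return 10 ** digits

for d in (2, 3, 4):
    print(d, classical_samples(d), quantum_samples(d))
```

For four digits of precision that is ten thousand quantum runs versus a hundred million classical samples, which is why risk desks are paying attention.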
Cloud environments will have to mature in how they share scarce quantum resources before such real-time tasks become feasible. For now, there are still queues, which means the systems are appropriate only for internal proof-of-concept projects rather than 24/7 production workloads.
Companies are hiring consultants and in-house developers to work on custom use cases, both to take advantage of today’s machines and to prepare for the quantum systems we expect to see within as little as two or three years. The buzz factor is high, and securing research money for quantum hardware and software projects seems effortless.
All this is stimulating innovation in a way we haven’t seen since the early buzz around the development of the Internet. We thought we knew some practical uses for the Internet in the 1990s, but we couldn’t have predicted what has happened since. Quantum will be the same, perhaps with the best use cases still undreamt of.
The quantum roadmap
Quantum computing hardware is already becoming something of an arms race, with most manufacturers making ambitious claims. ColdQuanta promises 100 qubits this year. IBM plans to far exceed that number soon, with 1,100 qubits expected by 2023. Honeywell came out of the gate claiming a tenfold improvement in power every year. IonQ promises to reach similar performance to IBM and to introduce interconnects, where systems perform calculations together to appear as one larger quantum computer.
Qubits are subject to interference, or noise, and error correction remains a hardware challenge. Depending on the approach, some vendors estimate that thousands of physical qubits will be needed to handle error correction and yield a single perfect “logical” qubit. At the other extreme, Honeywell claims that only three qubits can provide error correction for an additional four qubits to become logical ones. We still don’t know how error correction will play out.
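The simplest way to see why protecting information costs extra (qu)bits is the classical three-bit repetition code: store each bit three times and take a majority vote, which corrects any single flip. This is only an analogy, not Honeywell’s scheme or any quantum code; real quantum error correction must also handle phase errors, which is where the much larger overheads quoted above come from. A sketch, with an illustrative 5% flip rate:

```python
import random

# Classical three-bit repetition code as an analogy for
# error-correction overhead: three physical bits per logical bit.

def encode(bit: int) -> list[int]:
    """Store one logical bit in three physical copies."""
    return [bit, bit, bit]

def noisy_channel(bits: list[int], p_flip: float, rng) -> list[int]:
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (rng.random() < p_flip) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote corrects any single flipped bit."""
    return int(sum(bits) >= 2)

rng = random.Random(42)
p, trials = 0.05, 100_000
errors = sum(decode(noisy_channel(encode(1), p, rng)) != 1
             for _ in range(trials))
# Logical error rate ~ 3*p**2 = 0.75%, versus the 5% physical rate.
print(errors / trials)
```

Spending three bits per bit buys roughly a sevenfold reduction in error rate here; quantum codes trade far more physical qubits for the same kind of gain.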
Classical computers can simulate quantum computers, but only up to about 50 qubits. Quantum computers are already starting to pass that barrier, which means we are close to the point where quantum computers will hold a seemingly unchallengeable advantage over classical systems. And because a quantum computer’s state space doubles with every added qubit, classical systems will never catch up once surpassed at a particular task.
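The 50-qubit barrier comes straight from the doubling: a full simulation must store 2^n complex amplitudes, typically 16 bytes each. A quick back-of-the-envelope calculation shows why the wall is so hard:

```python
# Memory needed to hold a full n-qubit state vector:
# 2**n complex amplitudes at 16 bytes (two 64-bit floats) each.

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

# 30 qubits fit in a large server (~17 GB); 50 qubits would need
# roughly 18 million GB, i.e. about 18 petabytes of RAM.
for n in (30, 40, 50):
    print(f"{n} qubits -> {statevector_bytes(n) / 1e9:,.0f} GB")
```

Every added qubit doubles the figure, so no plausible growth in classical memory closes the gap.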
What will such a quantum advantage mean for mainstream computing? For starters, look for exponential performance improvements and likely cost and energy savings. Imagine a 30-hour optimization problem solved in three minutes, and you’ll get the general idea. In the future, machine learning in particular could finally enjoy the explosion in power that AI researchers have been dreaming of.