Quantum AI

What it is and why it matters

Quantum AI combines the power of quantum computing with artificial intelligence. This integration plays to the unique strengths of both technologies, using quantum bits, known as qubits, to perform advanced computations that classical computers can’t handle.

History of quantum AI

The concept of quantum computing emerged in the early 1980s, when physicist Richard Feynman proposed using machines governed by quantum mechanics to simulate physical systems that classical computers could not simulate efficiently. This idea laid the foundation for quantum computing, which uses the principles of quantum mechanics, including superposition and entanglement, to perform complex computations.

In the 1990s, the development of quantum algorithms, such as Shor's algorithm for factoring large numbers, demonstrated the potential of quantum computing to solve problems faster than classical computers. These advancements spurred interest in exploring the intersection of quantum computing and AI.

In 2013, NASA, Google and the Universities Space Research Association established the Quantum Artificial Intelligence Lab. This initiative aimed to pioneer research on how quantum computing could enhance machine learning and other complex computational tasks.

Around the same time, researchers began developing quantum machine learning algorithms, which leverage quantum computing to improve the speed and accuracy of AI models.

In recent years, the focus has shifted toward practical applications of quantum AI.

Companies at the forefront of this research are exploring hybrid architectures that combine quantum and classical computing. For instance, current research investigates the use of quantum annealing for optimization problems and the gate model for more universal applications like machine learning, quantum chemistry and simulation.

Quantum AI in today’s world


Quantum computing and AI

Quantum computing could redefine data analysis and model training in AI. Learn how we bridge today’s quantum reality with the visions of the future.


Growing up quantum in an AI world

For Bill Wisotsky, Principal Quantum Systems Architect, quantum computing is an idea that took root in his mind and wouldn’t let go. Read how his decades-long fascination led him to study hybrid quantum computing today.


 SAS defines hybrid reality for quantum computing

We’re at a point where quantum computing could redefine data analysis and model training in AI. Read more about it in this Forbes article.

Quantum AI explained

Learn about quantum computing and how it works from Amy Stout, Head of Quantum AI Product Strategy at SAS. What is a qubit? And how does quantum computing differ from classical computing? Stout provides clear answers and explains where we're headed with this emerging technology. 

Who's using quantum AI

Quantum AI has the potential to revolutionize industries by providing unprecedented computational power and efficiency.

Consider these industries where quantum computing could make a significant impact:

Health care

Quantum can transform health care by simulating complex biological systems and accelerating drug discovery. For instance, quantum computers can model molecular interactions at an atomic level, which is crucial for understanding diseases and developing new medications. This capability allows researchers to identify potential drug candidates more quickly and accurately, reducing the time and cost associated with bringing drugs to market.

Banking

In the financial services industry, quantum can be used to optimize investment portfolios, manage risk and detect fraud. Quantum algorithms can process extremely complex financial data in unique ways and identify patterns that traditional computers might miss. This allows financial institutions to develop more effective trading strategies and improve their risk management practices. Quantum techniques are also being explored to strengthen cryptographic methods and keep transactions secure.

Logistics and supply chain management

Quantum can improve logistics and supply chains by optimizing routing and scheduling through its ability to search an entire solution space at once, finding multiple high-quality solutions. For example, quantum algorithms can determine the most efficient routes for delivery trucks, minimizing fuel consumption and reducing delivery times. In warehouse management, quantum can improve inventory management and reduce operational costs.

Insurance

Insurers depend on data with highly complex relationships to accurately predict losses, price policies and customize offers. AI and quantum computing can refine risk assessments for insurers by analyzing these complex relationships simultaneously. For example, quantum AI could speed up the analysis of rapidly evolving risks like weather patterns and their impact on pricing and affordability trends. To take advantage of this new technology, insurers will need to collaborate with various stakeholders. 

“The quantum market is showing a lot of progression. It’s a $35 billion market, projected to reach a trillion by 2030. So, you can imagine what might happen in the next few years – the leaps we’ll make in this will be huge.”

Bryan Harris, Executive Vice President and Chief Technology Officer, SAS

How quantum AI works

Quantum computers are different from any existing classical computer, including smartphones and even the most powerful supercomputers. They take advantage of the unique properties of quantum mechanics, such as superposition and entanglement, to help solve certain classes of complex problems that are too challenging for classical computers to solve alone. In some cases, they can solve the problem significantly faster, and in other cases, they can represent the problem in ways that conventional computers cannot.

For now, quantum computers will not replace conventional computers but work alongside them as another tool. Under this paradigm, CPUs, GPUs and QPUs will work together to address the pieces of the problem for which they are best suited.

Classical computers use bits to represent data as either 0 or 1. However, quantum computers use quantum bits, or qubits, which can exist in multiple states simultaneously. The principle of superposition describes the existence of multiple states and can be illustrated with the following analogy:

Consider a coin. There are two clear states the coin can exist in, heads or tails, which can be thought of as the zero and one state of a classical bit. Now imagine the coin is spinning in the air. In this case, the heads and tails states exist together with an equal probability of measuring either state once the coin stops. Quantum computing can use this simultaneous nature by performing calculations on both heads (0) and tails (1) at the same time, as long as the coin remains spinning (in a state of superposition). 
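To make the analogy concrete, here is a minimal sketch of a single qubit in equal superposition, written in plain Python with NumPy rather than any real quantum hardware or SDK. The two amplitudes play the roles of heads and tails, and "stopping the coin" corresponds to a measurement that returns 0 or 1 with equal probability.

```python
# Minimal sketch of the spinning-coin analogy, using plain NumPy
# (no quantum hardware or quantum SDK involved).
import numpy as np

heads = np.array([1.0, 0.0])   # |0>, the "heads" state of a classical bit
tails = np.array([0.0, 1.0])   # |1>, the "tails" state

# Equal superposition: both amplitudes are 1/sqrt(2) while the "coin" spins.
spinning = (heads + tails) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes: 50% heads, 50% tails.
probabilities = np.abs(spinning) ** 2
print(probabilities)           # [0.5 0.5]

# "Stopping the coin" a few times: each measurement collapses to 0 or 1.
rng = np.random.default_rng(seed=0)
print(rng.choice([0, 1], size=10, p=probabilities))
```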


This state of superposition allows a single qubit to carry twice the information of a single classical bit. As you increase the number of qubits, the amount of information that can be processed grows exponentially, as 2^n for n qubits, significantly speeding up certain computations. For example, 10 qubits can represent 2^10 = 1,024 states simultaneously.
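The exponential growth comes from the size of the state vector needed to describe the register. A short illustrative sketch, again in plain NumPy, shows how the number of amplitudes scales as 2^n:

```python
# Illustrative only: an n-qubit register in equal superposition is described
# by a state vector with 2**n amplitudes.
import numpy as np

for n in (1, 2, 10, 20):
    dim = 2 ** n                               # 2^n amplitudes for n qubits
    state = np.full(dim, 1 / np.sqrt(dim))     # equal superposition of all basis states
    print(f"{n} qubits -> {dim} amplitudes")   # e.g., 10 qubits -> 1024 amplitudes
```

This is also why simulating large quantum systems classically becomes impractical: the 2^n amplitudes quickly outgrow classical memory.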

Next, let's learn about entanglement and quantum algorithms:

Entanglement

Another equally important quantum physical property used in quantum computing is entanglement. You can simply think of entanglement as quantum particles being correlated. When two qubits are entangled, if you know the state of one, then you automatically know the state of the other. Entanglement, when combined with superposition, can further increase computational power.
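A minimal sketch of that correlation, simulated as a two-qubit state vector in NumPy: the standard Hadamard and CNOT gates below prepare a Bell state, and the sampled outcomes are always 00 or 11 (the variable names and sampling are illustrative only).

```python
# Minimal sketch of entanglement on a simulated two-qubit state vector.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # flips qubit 2 when qubit 1 is 1

state = np.array([1.0, 0.0, 0.0, 0.0])          # |00>
state = CNOT @ (np.kron(H, I) @ state)          # Bell state (|00> + |11>) / sqrt(2)

# Sample joint measurements: only "00" and "11" ever appear, so measuring
# one qubit immediately tells you the state of the other.
probs = np.abs(state) ** 2
rng = np.random.default_rng(seed=1)
print(rng.choice(["00", "01", "10", "11"], size=10, p=probs))
```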

Quantum algorithms

Quantum AI also uses quantum algorithms to improve machine learning models. Quantum machine learning algorithms, such as quantum support vector machines and quantum neural networks, use quantum circuits to perform computations.

These quantum circuits represent a universal method of performing quantum computations. 

For instance, in a common implementation of a quantum neural network, classical data is encoded into quantum states. The quantum circuit applies parameterized rotations, entangling gates and measurements to examine complex relationships simultaneously. The measured output is passed to a classical optimizer, which feeds updated rotation parameters back into the circuit, repeating until an optimal configuration is found. This is similar to optimizing node weights in a classical neural network.
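The sketch below simulates a heavily simplified version of that loop in plain NumPy. It is not any vendor's API: the single-qubit data encoding, the one trainable rotation and the toy target value are all assumptions made for illustration, and a parameter-shift gradient stands in for the classical optimizer that feeds updated rotation parameters back into the circuit.

```python
# Toy hybrid loop: quantum-style circuit simulation + classical parameter updates.
# All names, the encoding and the target value are illustrative assumptions.
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])                     # observable measured at the end

def expectation(x, theta):
    """Encode data x as a rotation, apply trainable rotation theta, return <Z>."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])
    return float(state @ Z @ state)

x, target = 0.8, -1.0                        # one data point and a toy target
theta, lr = 0.1, 0.4                         # trainable parameter and learning rate

for step in range(50):
    # Parameter-shift rule: exact gradient of <Z> with respect to theta.
    grad = 0.5 * (expectation(x, theta + np.pi / 2) - expectation(x, theta - np.pi / 2))
    loss_grad = 2 * (expectation(x, theta) - target) * grad
    theta -= lr * loss_grad                  # classical update fed back into the circuit

print(theta, expectation(x, theta))          # <Z> approaches the target of -1
```

In a real quantum neural network the circuit would have many qubits, entangling gates and parameters, but the alternation between quantum evaluation and classical parameter updates follows this same pattern.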

Quantum AI – a hybrid approach

Since quantum computing technology is still maturing, quantum AI is a hybrid process involving a combination of quantum and classical computing approaches. In some cases, quantum processing happens first; in other cases, it happens last; and sometimes the work cycles between quantum and classical computing. This hybrid approach uses the strengths of both quantum and classical computing to achieve better performance and accuracy.

As quantum computers evolve, we will continue to see hybrid approaches that use the increasing reliability and scalability of quantum computers to enhance AI-assisted decisioning.

We are at the dawn of the integration between quantum computing and AI. This integration will become tighter as quantum computing matures. Currently, quantum computing manufacturers are experimenting with co-locating QPUs (quantum processing units) and specialized AIUs (AI units).

As this type of research evolves over the next 5 to 10 years, expect to see tremendous technological quantum-AI gains. These advances will change our existing methodologies and open doors to help solve complex problems in new and unique ways.

Next steps

Learn more about AI solutions from SAS, including generative AI, trustworthy AI – and more.

A data and AI platform

With SAS® Viya®, there’s no such thing as too much information. Learn about the quickest way to get from a billion points of data to a point of view.
