The D‑Wave system is used for computationally intensive tasks such as optimization, cyber security, machine learning, and sampling. These types of problems can be extremely hard to solve, with potentially enormous benefits if optimal solutions can be readily computed.
Optimization problems are some of the most complex computing problems, and they underlie some of the most important applications of interest to government. Examples include systems design, mission planning, scheduling, and machine learning.
One of the best-known examples of a complex optimization problem is the Traveling Salesman Problem (TSP): given a list of cities and the distances between them, find the shortest route that visits each city exactly once and returns to the starting city. This famous optimization problem is known to be NP-hard. D-Wave quantum computers natively solve binary optimization problems by formulating them as “quadratic unconstrained binary optimization,” or QUBO, problems.
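To make the QUBO idea concrete, here is a minimal sketch in plain Python. A QUBO asks for the binary vector x that minimizes the energy E(x) = Σᵢⱼ Qᵢⱼ xᵢ xⱼ. The brute-force solver and the toy Q matrix below are illustrative assumptions, not D-Wave's API; a quantum annealer explores the same kind of energy landscape but without enumerating every state.

```python
from itertools import product

def solve_qubo_brute_force(Q):
    """Exhaustively minimize E(x) = sum_ij Q[i][j] * x[i] * x[j] over binary x.

    Feasible only for small problems; shown here to make the QUBO
    objective concrete.
    """
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy QUBO (made-up coefficients): the off-diagonal +2 penalizes turning
# x0 and x1 on together; the diagonal terms reward turning variables on.
Q = [
    [-1,  2,  0],
    [ 0, -1,  0],
    [ 0,  0, -2],
]
x, e = solve_qubo_brute_force(Q)  # x == (0, 1, 1), e == -3
```

On real hardware the Q matrix is mapped onto qubit biases and couplings, and the annealer returns low-energy samples rather than a single exhaustive answer.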
When you look at a photograph it is very easy for you to pick out the different objects in the image: cars, trees, mountains, etc. This task is almost effortless for humans, but it is in fact hugely difficult for computers, because programmers don’t know how to define the essence of a ‘car’ in computer code.
Machine learning is the most successful approach to this problem: programmers write algorithms that automatically learn to recognize the “essences” of objects by detecting recurring patterns in huge amounts of data. Because of the amount of data involved and the immense number of potential combinations of data elements, this is a very computationally expensive optimization problem.
As with other optimization problems, these can be mapped to the native ability of the D‑Wave quantum processing unit, which finds solutions to these problems by drawing samples from a probability distribution similar to a Boltzmann distribution. This allows the D‑Wave system to be used to construct powerful machine learning algorithms based on probabilistic frameworks that can be applied to data analysis.
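A Boltzmann-like distribution assigns each state x a probability proportional to exp(−E(x)/T), so low-energy states are sampled most often. The sketch below illustrates that idea by exact enumeration over a tiny state space; the energy function and parameters are made-up examples, and this is not how the quantum processing unit itself operates.

```python
import math
import random
from itertools import product

def boltzmann_sample(energy, n_vars, temperature=1.0, n_samples=5, seed=0):
    """Draw binary states x with P(x) proportional to exp(-E(x)/T).

    Uses exact enumeration of all 2^n_vars states, so it is only
    feasible for small n_vars; shown to illustrate the distribution.
    """
    states = list(product((0, 1), repeat=n_vars))
    weights = [math.exp(-energy(x) / temperature) for x in states]
    rng = random.Random(seed)
    return rng.choices(states, weights=weights, k=n_samples)

# Toy energy function (an assumption for illustration):
# agreeing variables get low energy, so (0, 0) and (1, 1) dominate.
def energy(x):
    return 1.0 if x[0] != x[1] else -1.0

samples = boltzmann_sample(energy, n_vars=2, temperature=0.5, n_samples=10)
```

Lowering the temperature concentrates the samples on the minimum-energy states, which is why sampling and optimization are two faces of the same native capability.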
Many things in the world are uncertain and governed by the rules of probability. We have, in our heads, a model of how things will turn out in the future, and the better our model is, the better we are at predicting the future. We can also build computer models to try to capture the statistics of reality. These tend to be very complicated, involving a huge number of variables.
To check whether a computer’s statistical model represents reality, we need to be able to draw samples from it and check that the statistics of our model match the statistics of real-world data. Monte Carlo simulation, which relies on repeated random sampling to approximate the probability of certain outcomes, is an approach used in many industries, such as finance, energy, manufacturing, engineering, oil & gas, and the environment. For a complex model with many different variables, this is a difficult task to do quickly.
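The core of Monte Carlo simulation fits in a few lines: sample from a model repeatedly and count how often an outcome of interest occurs. The dice model below is a toy example chosen for illustration; real applications replace it with a far more complex model, which is where the computational cost comes from.

```python
import random

def monte_carlo_probability(event, trial, n_trials=100_000, seed=42):
    """Estimate P(event) by repeated random sampling from a model.

    `trial(rng)` draws one sample from the model; `event(sample)`
    returns True when the outcome of interest occurred.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials) if event(trial(rng)))
    return hits / n_trials

# Toy model: the sum of two fair six-sided dice.
def roll_two_dice(rng):
    return rng.randint(1, 6) + rng.randint(1, 6)

# Estimate P(sum >= 10); the exact value is 6/36, roughly 0.167.
p = monte_carlo_probability(lambda s: s >= 10, roll_two_dice)
```

The estimate converges slowly (error shrinks like 1/√n), so models with many variables need enormous numbers of samples, which is what makes fast, high-quality sampling so valuable.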
Cyber security is a huge issue for governments and enterprises alike, and entirely new strategies are needed to combat ever-increasing threat levels. Quantum computing will likely play an important part in providing the next generation of cyber security, including threat identification, identification of cyber adversaries, and securing the global networks we all depend on.