Google announces TensorFlow Quantum, an open-source library for quantum machine learning

On its official AI blog, Google announced the launch of TensorFlow Quantum (TFQ), an open-source library that combines quantum computing with machine learning to build and train quantum models. Google says that the quantum machine learning models built with TFQ can process quantum data and can be executed on a quantum computer.

According to the Google AI Blog, TFQ allows researchers to represent quantum datasets, quantum models, and classical control parameters as tensors in a single computational graph. TensorFlow ops then take quantum measurements of these models, converting quantum states into classical probabilistic outputs that can be post-processed and trained with standard Keras functions.
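As a minimal illustration of that idea, the sketch below (based on the publicly documented TFQ API; the one-qubit circuit and the parameter value 0.5 are invented for demonstration) converts a parameterized Cirq circuit into a tensor and measures it with a TensorFlow op:

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)
theta = sympy.Symbol('theta')

# A parameterized Cirq circuit, represented as a tensor inside the TensorFlow graph.
circuit_tensor = tfq.convert_to_tensor([cirq.Circuit(cirq.rx(theta)(qubit))])

# tfq.layers.Expectation is a TensorFlow op that measures the circuit and returns
# a classical expectation value, which downstream Keras layers can consume.
expectation = tfq.layers.Expectation()(
    circuit_tensor,
    symbol_names=[theta],
    symbol_values=tf.constant([[0.5]]),
    operators=[cirq.Z(qubit)])
```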

To build and train such a model, the general steps are as follows (an end-to-end sketch in code follows the list):

  1. Prepare a quantum dataset – Quantum data is loaded as tensors (multi-dimensional arrays of numbers). Each quantum data tensor is specified as a quantum circuit written in Cirq that generates quantum data on the fly. The tensor is executed by TensorFlow on the quantum computer to generate a quantum dataset.
  2. Evaluate a quantum neural network model – The researcher can prototype a quantum neural network using Cirq that they will later embed inside a TensorFlow compute graph. Parameterized quantum models can be selected from several broad categories based on knowledge of the quantum data’s structure. The goal of the model is to perform quantum processing in order to extract information hidden in a typically entangled state. In other words, the quantum model essentially disentangles the input quantum data, leaving the hidden information encoded in classical correlations, thus making it accessible to local measurements and classical post-processing.
  3. Sample or Average – Measurement of quantum states extracts classical information in the form of samples from a classical random variable. The distribution of values from this random variable generally depends on the quantum state itself and on the measured observable. As many variational algorithms depend on mean values of measurements, also known as expectation values, TFQ provides methods for averaging over several runs involving steps (1) and (2).
  4. Evaluate a classical neural network model – Once classical information has been extracted, it is in a format amenable to further classical post-processing. As the extracted information may still be encoded in classical correlations between measured expectations, classical deep neural networks can be applied to distill such correlations.
  5. Evaluate Cost Function – Given the results of classical post-processing, a cost function is evaluated. This could be based on how accurately the model performs the classification task if the quantum data was labeled, or other criteria if the task is unsupervised.
  6. Evaluate Gradients & Update Parameters – After evaluating the cost function, the free parameters in the pipeline should be updated in a direction expected to decrease the cost. This is most commonly performed via gradient descent.
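
The following sketch ties steps (1)–(3) together, again assuming the public TFQ and Cirq APIs; the single-qubit "dataset" and its labels are invented purely for illustration:

```python
import cirq
import sympy
import numpy as np
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)

# Step 1: a toy quantum dataset -- Cirq circuits that prepare data states,
# converted to a tensor that TensorFlow can feed to quantum ops.
angles = np.random.uniform(0, 2 * np.pi, size=50)
data_circuits = [cirq.Circuit(cirq.rx(a)(qubit)) for a in angles]
labels = (np.cos(angles) > 0).astype(np.float32)   # invented labels for the toy task
quantum_data = tfq.convert_to_tensor(data_circuits)

# Step 2: a parameterized quantum model circuit whose symbol will be trained.
theta = sympy.Symbol('theta')
model_circuit = cirq.Circuit(cirq.ry(theta)(qubit))

# Step 3: measure the Z observable; tfq.layers.PQC averages over runs and
# exposes the resulting expectation value as the layer's output.
readout = cirq.Z(qubit)
inputs = tf.keras.Input(shape=(), dtype=tf.string)
quantum_features = tfq.layers.PQC(model_circuit, readout)(inputs)
```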
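Continuing the same sketch, steps (4)–(6) are plain Keras: a classical dense layer post-processes the measured expectations, a standard loss serves as the cost function, and gradient descent updates both the classical weights and the circuit parameter:

```python
# Step 4: classical post-processing of the measured expectation values.
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(quantum_features)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

# Steps 5 and 6: cost function plus gradient-based parameter updates.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.05),
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(quantum_data, labels, epochs=20, verbose=0)
```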