Author: Johann Maximilian Zollner, [email protected]
Quantum machine learning is an emerging research field, and hybrid classical-quantum systems utilize existing quantum hardware for machine learning. Pre- and post-processing run on classical hardware to overcome the limitations of today's quantum computers in terms of capacity and the length of the sequence of operations [1]. Due to noise in quantum hardware, only a limited number of quantum bits (qubits) and operations (gates) are realizable. Nevertheless, quantum computing is promising, since it may speed up algorithms and outperform classical approaches in terms of computation time.
In this small tutorial, we will use a hybrid system consisting of a convolutional autoencoder model and an entangled quantum circuit to perform binary classification of satellite imagery. The autoencoder reduces each input image to only 16 values. The quantum circuit is simulated on a classical CPU. PennyLane is used as the framework for quantum machine learning, in combination with TensorFlow.
Note: This script is designed to run even on weaker CPUs, which is why the batch sizes and numbers of epochs are small. However, you may modify those and set them as you like.
import multiprocessing
import numpy as np
import matplotlib.pyplot as plt
import pennylane as qml
import tensorflow as tf
# Optional seed for reproducibility of results. You may change or delete this for your own experiments
np.random.seed(42)
tf.random.set_seed(42)
# Latent dimension and number of data qubits num. You may experiment with different values for num, but the classifier with num+1 qubits already takes about one hour to train
num = 16
We use the NWPU-RESISC45 [2] dataset, which can be downloaded here. The dataset has rich image variations, high within-class diversity, and high between-class similarity. Thus, it poses a far more sophisticated task than frequently used image classification datasets such as the famous MNIST.
For binary classification you can choose any two classes you like; here we have taken beach and harbor.
# Define path for the dataset, here we extracted it into the same directory as the notebook
path = './NWPU-RESISC45'
# Keras data generator
datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1/255, validation_split=0.2)
train_gen = datagen.flow_from_directory(
path,
target_size=(256, 256),
color_mode='rgb',
classes=['beach','harbor'],
class_mode='binary',
batch_size=1120,
shuffle=True,
subset='training'
)
test_gen = datagen.flow_from_directory(
path,
target_size=(256, 256),
color_mode='rgb',
classes=['beach','harbor'],
class_mode='binary',
batch_size=280,
shuffle=True,
subset='validation'
)
# Create training and test data
x_train, y_train = next(train_gen)
x_test, y_test = next(test_gen)
Found 1120 images belonging to 2 classes.
Found 280 images belonging to 2 classes.
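As a quick, optional sanity check, you can confirm the shapes and class balance of the batches just drawn from the generators (the expected shapes below follow directly from the generator settings above):
# Optional sanity check: shapes and class balance of the loaded data
print(x_train.shape, y_train.shape)  # expected: (1120, 256, 256, 3) (1120,)
print(x_test.shape, y_test.shape)    # expected: (280, 256, 256, 3) (280,)
print('Class counts (train):', dict(zip(*np.unique(y_train, return_counts=True))))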
# Show four random training images with their labels (0 = beach, 1 = harbor)
f, axarr = plt.subplots(2, 2, figsize=(10, 10))
f.tight_layout(pad=2.0)
for ax in axarr.flat:
    i = np.random.randint(len(x_train))
    ax.imshow(x_train[i])
    ax.set_title(y_train[i])
plt.show()
The preprocessing of the data is essentially a dimensionality reduction. We use a convolutional autoencoder model to reduce the input images from 256x256x3 values to a 4x4x1 latent representation, i.e. 16 values, one per data qubit.
# The convolutional autoencoder architecture
class ConvAutoencoder(tf.keras.models.Model):
def __init__(self, latent_dim):
super(ConvAutoencoder, self).__init__()
self.latent_dim = latent_dim
self.encoder = tf.keras.Sequential([
tf.keras.layers.Conv2D(input_shape=(256, 256, 3), filters=64, kernel_size=3,
padding='same', name='enc'),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.MaxPooling2D((4, 4)),
tf.keras.layers.Conv2D(32, 3, padding='same'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.MaxPooling2D((4, 4)),
tf.keras.layers.Conv2D(16, 3, padding='same'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.MaxPooling2D((2, 2)),
tf.keras.layers.Conv2D(1, 1, padding='same'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.MaxPooling2D((2, 2)),
tf.keras.layers.Flatten(),
tf.keras.layers.Dense((latent_dim))
])
self.decoder = tf.keras.Sequential([
tf.keras.layers.Dense((latent_dim)),
tf.keras.layers.Reshape((4, 4, 1)),
tf.keras.layers.Conv2D(input_shape=(4, 4, 1), filters=1, kernel_size=1, padding='same'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.UpSampling2D((2, 2)),
tf.keras.layers.Conv2D(16, 3, padding='same'),
tf.keras.layers.BatchNormalization(),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.UpSampling2D((2, 2)),
tf.keras.layers.Conv2D(32, 3, padding='same'),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.UpSampling2D((4, 4)),
tf.keras.layers.Conv2D(64, 3, padding='same'),
tf.keras.layers.LeakyReLU(alpha=0.3),
tf.keras.layers.UpSampling2D((4, 4)),
tf.keras.layers.Conv2D(3, 3, padding='same')
])
def call(self, x):
encoded = self.encoder(x)
decoded = self.decoder(encoded)
return decoded
# Create the autoencoder model
ae = ConvAutoencoder(num)
ae.compile(optimizer='adam', loss=tf.keras.losses.MeanSquaredError())
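Optionally, you can build the model on a dummy batch and print the encoder's layer shapes before training:
# Optional: build the model once and inspect the encoder architecture
_ = ae(tf.zeros((1, 256, 256, 3)))
ae.encoder.summary()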
Here we train the autoencoder model over 5 epochs to save computation time. A higher number of epochs may improve the result.
ae.fit(
x_train,
x_train,
epochs=5,
shuffle=False,
validation_data=(x_test, x_test),
workers=multiprocessing.cpu_count()
)
Epoch 1/5
35/35 [==============================] - 83s 2s/step - loss: 0.0627 - val_loss: 0.1070
Epoch 2/5
35/35 [==============================] - 81s 2s/step - loss: 0.0393 - val_loss: 0.0804
Epoch 3/5
35/35 [==============================] - 81s 2s/step - loss: 0.0360 - val_loss: 0.0693
Epoch 4/5
35/35 [==============================] - 81s 2s/step - loss: 0.0343 - val_loss: 0.0629
Epoch 5/5
35/35 [==============================] - 81s 2s/step - loss: 0.0331 - val_loss: 0.0592
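To get a feeling for how much visual information survives the compression, it can help to compare a few test images with their reconstructions. A minimal sketch (the final Conv2D has no activation, so we clip the outputs to [0, 1] for display):
# Optional: compare test images with their reconstructions
recon = ae.predict(x_test[:4])
f, axarr = plt.subplots(2, 4, figsize=(12, 6))
for i in range(4):
    axarr[0, i].imshow(x_test[i])
    axarr[0, i].set_title('original')
    axarr[1, i].imshow(np.clip(recon[i], 0, 1))
    axarr[1, i].set_title('reconstruction')
plt.show()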
The quantum circuit is based on [3]. The architecture produces highly entangled quantum states and has already shown promising results in previous work [1].
# Define layers for the quantum circuit
# Each two-qubit gate is a tensor product of two Pauli operators and couples one data qubit with the readout qubit
def layerXX(W):
    for i in range(num):
        qml.IsingXX(phi=W[0, i], wires=[0, i + 1])

def layerZZ(W):
    for i in range(num):
        qml.IsingZZ(phi=W[1, i], wires=[0, i + 1])
# Set the default simulator device for PennyLane: num data qubits plus one readout qubit
dev = qml.device('default.qubit', wires=num+1)
# The quantum circuit
@qml.qnode(dev, interface='tf')
def circuit(inputs, weights):
    # Quantum encoding is required to process classical data
    # Every input value is encoded by a single-qubit rotation
qml.AngleEmbedding(features=inputs, wires=range(1,num+1), rotation='X')
# Prepare readout qubit
qml.PauliX(wires=0)
qml.Hadamard(wires=0)
# Add entangling layers
layerXX(weights)
layerZZ(weights)
# Hadamard gate on the readout qubit since we want to measure in the X-basis
qml.Hadamard(wires=0)
# Standard basis measurement on the readout qubit to obtain an expectation value
return qml.expval(qml.PauliZ(wires=0))
# Function to compute hinge accuracy
def accuracy(label, predictions):
y_true = tf.squeeze(label) > 0.0
y_pred = tf.squeeze(predictions) > 0.0
result = tf.cast(y_true == y_pred, tf.float64)
return tf.reduce_mean(result)
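A tiny illustration of this metric: only the signs of labels and predictions are compared, so three of the four toy predictions below count as correct:
# Example: hinge accuracy compares only the signs
labels = tf.constant([-1.0, 1.0, 1.0, -1.0])
preds = tf.constant([-0.3, 0.7, -0.2, -0.9])
print(accuracy(labels, preds).numpy())  # 0.75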
# Define weight shapes
weight_shapes = {'weights': (2, num)}
# Visualize the circuit architecture with random parameters
weights = np.random.rand(2, num)
print(qml.draw(circuit)(inputs=np.random.rand(num), weights=weights))
0: ──X───────────────────H─╭IsingXX(0.09)─╭IsingXX(0.23)─ ··· ─╭IsingZZ(0.90)──H─┤ <Z>
1: ─╭AngleEmbedding(M0)────╰IsingXX(0.09)
2: ─├AngleEmbedding(M0)───────────────────╰IsingXX(0.23)
⋮
16: ─╰AngleEmbedding(M0)
(Drawing truncated: after the angle embedding, one IsingXX and one IsingZZ gate per data qubit couple the readout qubit 0 to data qubits 1-16 in turn, followed by a Hadamard and the ⟨Z⟩ measurement on the readout qubit.)
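Before wrapping the circuit into a Keras layer, you can optionally evaluate it once; the output is a single expectation value in [-1, 1], whose sign will later serve as the predicted hinge label:
# Optional: a single forward pass through the circuit with random angles
sample = np.random.uniform(0, 2 * np.pi, size=num)
print(circuit(inputs=sample, weights=weights))  # expectation value in [-1, 1]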
# Set the default float type for the keras backend
tf.keras.backend.set_floatx('float32')
# Create the quantum layer for the Keras model
quantum_layer = qml.qnn.KerasLayer(circuit, weight_shapes, output_dim=1, dtype='float32')
# Create the keras model
model = tf.keras.models.Sequential([quantum_layer])
# Compile the model
model.compile(tf.keras.optimizers.Adam(), loss=tf.keras.losses.SquaredHinge(), metrics=[accuracy])
Now, we encode the satellite images with the trained encoder and normalize them to suit the angle embedding. Furthermore, we change our labels from [0, 1] to hinge labels [-1, 1], as suggested for the quantum circuit in [3].
# Function for min-max rescaling of the encoded features to the range [0, 2*pi]
def get_angles(x):
    x_ = (x - x.min()) * (2 * np.pi / (x.max() - x.min()))
    return x_.astype('float64')
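For illustration, get_angles maps the minimum of an array to 0 and its maximum to 2*pi:
# Example: min-max scaling onto [0, 2*pi]
demo = np.array([-1.0, 0.0, 3.0])
print(get_angles(demo))  # [0.         1.57079633 6.28318531]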
# Since we can run out of memory easily when loading all images at the same time, we encode the training data in batches
def batch_encode(autoencoder, x, num_batches):
cut = int(len(x) / num_batches)
encoded = []
j = 0
for i in range(1, num_batches + 1):
encoded.extend(autoencoder.encoder(x[j*cut:i*cut]))
j = i
return np.asarray(encoded)
# Reduce every image to 16 values with the previously trained encoder
x_train_enc = batch_encode(ae, x_train, 32)
x_test_enc = batch_encode(ae, x_test, 8)
# Normalize the values
x_train_enc = get_angles(x_train_enc)
x_test_enc = get_angles(x_test_enc)
# Transform labels from [0, 1] to [-1, 1] since we use a squared hinge loss and hinge accuracy as metric
y_train = y_train*2-1
y_test = y_test*2-1
Simulating a quantum system on classical hardware is computationally expensive. Depending on your CPU, training a model with 17 qubits over 5 epochs can take from several minutes up to an hour.
hybrid = model.fit(x_train_enc,
y_train,
epochs = 5,
shuffle = False,
batch_size = 16,
validation_data = (x_test_enc, y_test))
Epoch 1/5
70/70 [==============================] - 645s 9s/step - loss: 1.1713 - accuracy: 0.5000 - val_loss: 1.0309 - val_accuracy: 0.5000
Epoch 2/5
70/70 [==============================] - 641s 9s/step - loss: 0.9126 - accuracy: 0.5938 - val_loss: 0.8732 - val_accuracy: 0.6910
Epoch 3/5
70/70 [==============================] - 638s 9s/step - loss: 0.8255 - accuracy: 0.7616 - val_loss: 0.8113 - val_accuracy: 0.7292
Epoch 4/5
70/70 [==============================] - 638s 9s/step - loss: 0.7785 - accuracy: 0.7741 - val_loss: 0.7696 - val_accuracy: 0.7431
Epoch 5/5
70/70 [==============================] - 637s 9s/step - loss: 0.7444 - accuracy: 0.7884 - val_loss: 0.7391 - val_accuracy: 0.7604
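After training, you can optionally run a final evaluation on the encoded test set:
# Optional: final evaluation on the encoded test set
loss, acc = model.evaluate(x_test_enc, y_test, batch_size=16)
print(f'Test loss: {loss:.4f}, hinge accuracy: {acc:.4f}')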
plt.figure(figsize=(10, 5))
plt.plot(hybrid.history['loss'], label='Train loss')
plt.plot(hybrid.history['val_loss'], label='Validation loss')
plt.title('Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()
plt.figure(figsize=(10, 5))
plt.plot(hybrid.history['accuracy'], label='Train accuracy')
plt.plot(hybrid.history['val_accuracy'], label='Validation accuracy')
plt.title('Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
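As a last optional check, you can inspect a few individual predictions; the sign of the circuit output is the predicted hinge label:
# Optional: inspect a few individual predictions
preds = model.predict(x_test_enc[:5]).squeeze()
print('Predicted labels:', np.sign(preds))
print('True labels:     ', y_test[:5])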
We trained a quantum machine learning model for binary classification of satellite imagery and demonstrated the potential use of quantum machine learning for this task. Although large images with 256x256x3 values were mapped to only 16 data qubits, sufficient accuracy for binary classification could be achieved. However, a larger number of epochs for both the autoencoder model and the quantum classifier could further improve the classification accuracy.
We encourage you to test other combinations of classes or to modify the code and start your own experiments. For example, the latent dimension of the encoder and the number of qubits, the architecture of the convolutional autoencoder, the quantum encoding method, and the circuit architecture all heavily influence the results, and changes may lead to new insights.
[1] J. M. Zollner. 2022. Quantum Classifiers for Remote Sensing. In Proceedings of the 30th International Conference on Advances in Geographic Information Systems (SIGSPATIAL '22). Association for Computing Machinery, New York, NY, USA, Article 117, 1–2. doi: 10.1145/3557915.3565537
[2] G. Cheng, J. Han, X. Lu. 2017. Remote sensing image scene classification: Benchmark and state of the art. Proceedings of the IEEE 105.10 (2017): 1865-1883. doi: 10.1109/JPROC.2017.2675998
[3] E. Farhi, H. Neven. 2018. Classification with Quantum Neural Networks on Near Term Processors. doi: 10.48550/ARXIV.1802.06002