Customized Model
!pip install autokeras
import keras
import numpy as np
import tree
from keras.datasets import mnist
import autokeras as ak
In this tutorial, we show how to customize your search space with AutoModel and how to implement your own block as search space. This API is mainly for advanced users who already know what their model should look like.
Customized Search Space
First, let us see how we can build the following neural network using the building blocks in AutoKeras.
We can make use of the AutoModel API in AutoKeras to implement it as follows. The usage is the same as the Keras functional API. Since this is just a demo, we use small values for max_trials and epochs.
input_node = ak.ImageInput()
output_node = ak.Normalization()(input_node)
output_node1 = ak.ConvBlock()(output_node)
output_node2 = ak.ResNetBlock(version="v2")(output_node)
output_node = ak.Merge()([output_node1, output_node2])
output_node = ak.ClassificationHead()(output_node)
auto_model = ak.AutoModel(
    inputs=input_node, outputs=output_node, overwrite=True, max_trials=1
)
While building the model, the blocks used need to follow this topology: Preprocessor -> Block -> Head. Normalization and ImageAugmentation are Preprocessors. ClassificationHead is a Head. The rest are Blocks.
In the code above, we use ak.ResNetBlock(version='v2')
to specify the version
of ResNet to use. There are many other arguments to specify for each building
block. For most of the arguments, if not specified, they would be tuned
automatically. Please refer to the documentation links at the bottom of the
page for more details.
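To illustrate what "tuned automatically" means, each unspecified argument becomes a hyperparameter that the tuner samples once per trial. The following is a rough pure-Python sketch of that sampling behavior; FakeHP is a hypothetical stand-in, not the real KerasTuner HyperParameters class.

```python
import random


class FakeHP:
    """Toy stand-in for KerasTuner's HyperParameters object.

    It records each declared search range and returns a randomly
    sampled value, mimicking one draw per trial.
    """

    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self.values = {}

    def Int(self, name, min_value, max_value, step=1):
        choices = list(range(min_value, max_value + 1, step))
        value = self._rng.choice(choices)
        self.values[name] = value
        return value

    def Choice(self, name, options):
        value = self._rng.choice(options)
        self.values[name] = value
        return value


# One "trial": arguments you did not fix are drawn from their ranges.
hp = FakeHP(seed=0)
version = hp.Choice("version", ["v1", "v2"])
num_units = hp.Int("num_units", min_value=32, max_value=512, step=32)
assert version in ("v1", "v2")
assert 32 <= num_units <= 512 and num_units % 32 == 0
```

Fixing an argument, as with ak.ResNetBlock(version="v2"), simply removes that hyperparameter from the search space.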
Then, we prepare some data to run the model.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
print(x_train.shape) # (60000, 28, 28)
print(y_train.shape) # (60000,)
print(y_train[:3])  # array([5, 0, 4], dtype=uint8)
# Feed the AutoModel with training data.
auto_model.fit(x_train[:100], y_train[:100], epochs=1)
# Predict with the best model.
predicted_y = auto_model.predict(x_test)
# Evaluate the best model with testing data.
print(auto_model.evaluate(x_test, y_test))
For a search space with multiple input nodes and multiple heads, you can refer to this section.
Validation Data
If you would like to provide your own validation data or change the ratio of the validation data, please refer to the Validation Data section of the tutorials of Image Classification, Text Classification, Multi-task and Multiple Validation.
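As a quick sketch of what changing the validation ratio means (plain Python, not the AutoModel API; the 0.15 ratio is just an illustrative value), Keras' validation_split argument holds out the tail of the training data:

```python
def split_validation(x, y, validation_split=0.15):
    """Hold out the last `validation_split` fraction of the data,
    mirroring how Keras' validation_split carves off the tail of
    the training set before training starts."""
    assert len(x) == len(y)
    n_val = int(len(x) * validation_split)
    n_train = len(x) - n_val
    return (x[:n_train], y[:n_train]), (x[n_train:], y[n_train:])


x = list(range(100))
y = [v % 10 for v in x]
(x_tr, y_tr), (x_val, y_val) = split_validation(x, y, validation_split=0.15)
print(len(x_tr), len(x_val))  # 85 15
```

Passing your own validation data instead bypasses this split entirely.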
Data Format
You can refer to the documentation of ImageInput, TextInput, RegressionHead, and ClassificationHead for the format of different types of data. You can also refer to the Data Format section of the tutorials of Image Classification and Text Classification.
Implement New Block
You can extend the Block class to implement your own building blocks and use it with AutoModel.
The first step is to learn how to write a build function for KerasTuner. You need to override the build function of the block. The following example shows how to implement a single Dense layer block whose number of neurons is tunable.
class SingleDenseLayerBlock(ak.Block):
    def build(self, hp, inputs=None):
        # Get the input_node from inputs.
        input_node = tree.flatten(inputs)[0]
        layer = keras.layers.Dense(
            hp.Int("num_units", min_value=32, max_value=512, step=32)
        )
        output_node = layer(input_node)
        return output_node
You can connect it with other blocks and build it into an AutoModel.
# Build the AutoModel
input_node = ak.Input()
output_node = SingleDenseLayerBlock()(input_node)
output_node = ak.RegressionHead()(output_node)
auto_model = ak.AutoModel(input_node, output_node, overwrite=True, max_trials=1)
# Prepare Data
num_instances = 100
x_train = np.random.rand(num_instances, 20).astype(np.float32)
y_train = np.random.rand(num_instances, 1).astype(np.float32)
x_test = np.random.rand(num_instances, 20).astype(np.float32)
y_test = np.random.rand(num_instances, 1).astype(np.float32)
# Train the model
auto_model.fit(x_train, y_train, epochs=1)
print(auto_model.evaluate(x_test, y_test))
Reference
Nodes: ImageInput, Input, TextInput.
Preprocessors: FeatureEngineering, ImageAugmentation, LightGBM, Normalization.
Blocks: ConvBlock, DenseBlock, Merge, ResNetBlock, RNNBlock, SpatialReduction, TemporalReduction, XceptionBlock, ImageBlock, TextBlock.