
Adding New Units To A Keras Model Layer And Changing Their Weights

I am working on a project that requires me to add new units to the output layer of a neural network to implement a form of transfer learning. I was wondering if I could do this and, if so, how to set the weights of those new units.

Solution 1:

Stumbled upon the answer to my own question. Thanks everyone for the answers/comments.

https://keras.io/layers/about-keras-layers/

The first few lines of that page describe how to get and set layer weights. Essentially, appending an output neuron to a Keras model can be accomplished by reading the old output layer's weights, appending weights for the new unit, and setting the combined weights on a new, wider layer. Code is below.

import numpy as np
from keras.layers import Dense

# Load weights of the previous output layer, then drop that layer (Sequential model)
old_kernel, old_bias = model.layers[-1].get_weights()
model.pop()

# Initialise weights for the new output neuron: one extra kernel column and bias entry
new_kernel_col = np.zeros((bottleneck_size, 1))
new_bias = np.zeros(1)

# Add a wider output layer, then set its weights to old + new
new_layer = Dense(num_classes)  # num_classes now includes the new unit
model.add(new_layer)
new_layer.set_weights([np.concatenate([old_kernel, new_kernel_col], axis=1),
                       np.concatenate([old_bias, new_bias])])
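After the layer surgery above, the model has to be recompiled before fine-tuning on data that includes the new class. A minimal sketch, assuming the output layer is given a softmax activation and the labels are one-hot; the optimizer, loss, and the x_new / y_new names are illustrative, not from the original answer:

# Recompile so the modified output layer is used in training, then fine-tune
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(x_new, y_new, epochs=5, batch_size=32)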

Solution 2:

You can add new units to the output layer of a pre-trained neural network. This form of transfer learning is commonly described as using the bottleneck features of a pre-trained network, and it can be implemented in both TensorFlow and Keras.

Please find the Keras tutorial below: https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html

Also, find the TensorFlow tutorial below:

https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/08_Transfer_Learning.ipynb
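For reference, a minimal sketch of the bottleneck-features approach those tutorials describe, using VGG16 as the frozen base; the input size, the classifier layout, and the x_train / y_train / num_classes names are illustrative assumptions, not taken verbatim from the tutorials:

import numpy as np
from keras.applications import VGG16
from keras.models import Sequential
from keras.layers import Flatten, Dense

# Pre-trained convolutional base without its original classifier head
base = VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))

# Compute the bottleneck features once; the base itself is never trained
bottleneck_train = base.predict(x_train)

# Small classifier trained on top of the bottleneck features
top = Sequential([
    Flatten(input_shape=bottleneck_train.shape[1:]),
    Dense(256, activation='relu'),
    Dense(num_classes, activation='softmax'),
])
top.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
top.fit(bottleneck_train, y_train, epochs=10, batch_size=32)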

Hope this helps!

