This article covers the question "Is there a way to freeze specific layers in a KerasLayer?" and a recommended answer. It may be a useful reference for anyone facing the same problem.

Problem description

I'm currently building a CNN that uses transfer learning to classify images. In my model, there is a tensorflow-hub KerasLayer that uses EfficientNet to create a feature vector.

My code is here:

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers, models

model = models.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/efficientnet/b7/feature-vector/1", trainable=True),  # trainable EfficientNet backbone
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(1, activation="sigmoid")
])

I can freeze or unfreeze the entire KerasLayer, but I can't seem to find a way to freeze only the earlier layers and fine-tune the higher-level parts. Can anyone help?
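For context, the whole hub layer is frozen or unfrozen through the trainable flag it is constructed with. The snippet below simply restates what the model above already does and is not part of the original question:

feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/google/efficientnet/b7/feature-vector/1",
    trainable=False)  # False freezes the entire EfficientNet backbone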

Recommended answer

You can freeze an entire layer by setting layer.trainable = False. If you load a whole model or build one from scratch, you can loop over its layers like this to find the specific layer to freeze.

# imports (assuming tensorflow.keras)
from tensorflow import keras
from tensorflow.keras.models import Model

# load a model or create a model
model = Model(...)

# first you print out your model summary
model.summary()

# you will get something like this
'''
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
inception_resnet_v2 (Model)  (None, 2, 2, 1536)        54336736
_________________________________________________________________
flatten_2 (Flatten)          (None, 6144)              0
_________________________________________________________________
dropout_2 (Dropout)          (None, 6144)              0
_________________________________________________________________
dense_8 (Dense)              (None, 2048)              12584960
_________________________________________________________________
dense_9 (Dense)              (None, 1024)              2098176
_________________________________________________________________
dense_10 (Dense)             (None, 512)               524800
_________________________________________________________________
dense_11 (Dense)             (None, 17)                8721
=================================================================
'''

# here is the loop for freezing a particular layer (dense_10 in this example)
for layer in model.layers:
    # selecting layer by name
    if layer.name == 'dense_10':
        layer.trainable = False
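# Note (added here, not part of the original answer): changes to layer.trainable
# only take effect once the model is compiled again, so recompile afterwards.
# The optimizer and loss below are placeholders.
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])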

# for that hub layer, you need to create the layer outside your model just for easy access

# my inception layer
inception_layer = keras.applications.InceptionResNetV2(weights='imagenet', include_top=False, input_shape=(128, 128, 3))

# add it to the model (this assumes the model is a keras.Sequential, so .add works)
model.add(inception_layer)

# same trick
inception_layer.summary()

# here is the same loop as in the example above
for layer in inception_layer.layers:
    # selecting layer by name
    if layer.name == 'block8_10_conv':
        layer.trainable = False
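
To address the original question more directly (freeze the earlier layers and fine-tune only the higher-level ones), a common pattern is to freeze everything except the last few layers of the base model. The sketch below follows the answer's keras.applications substitute rather than the hub layer; the layer count, head, and hyperparameters are placeholders, not values from the original post:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, models

# base model (same substitute the answer uses instead of the hub KerasLayer)
base = keras.applications.InceptionResNetV2(
    weights='imagenet', include_top=False, input_shape=(128, 128, 3))

# freeze everything except the last 20 layers, so only the higher-level
# blocks are fine-tuned (20 is an arbitrary example, tune it to your data)
for layer in base.layers[:-20]:
    layer.trainable = False

# stack a small classification head on top (placeholder sizes)
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),
    layers.Dense(1, activation='sigmoid'),
])

# remember to (re)compile after changing trainable flags
model.compile(optimizer=keras.optimizers.Adam(1e-4),
              loss='binary_crossentropy',
              metrics=['accuracy'])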

This concludes the article on "Is there a way to freeze specific layers in a KerasLayer?". We hope the recommended answer above is helpful.
