This article describes how to add L2 regularization when using the high-level tf.layers API; the answer below may be a useful reference for anyone facing the same problem.

Question

Is it possible to add an L2 regularization when using the layers defined in tf.layers?

It seems to me that since tf.layers is a high-level wrapper, there is no easy way to get access to the filter weights.

With tf.nn.conv2d:

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)

weights = tf.get_variable(
    name="weights",
    shape=[3, 3, in_channels, out_channels],  # example filter shape
    regularizer=regularizer
)

# Previous layers
...

# Second layer
layer2 = tf.nn.conv2d(
    input,
    weights,
    strides=[1, 1, 1, 1],
    padding="SAME")

# More layers
...

# Loss
loss = ...  # some loss

reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
loss += reg_term
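The bookkeeping above can be checked numerically. A minimal NumPy sketch, assuming `l2_regularizer` follows the `tf.nn.l2_loss` convention (half the sum of squares, scaled) and that `apply_regularization` sums the per-variable penalties; the weight values here are illustrative, not from the original model:

```python
import numpy as np

def l2_penalty(weights, scale=0.1):
    """L2 penalty in the tf.nn.l2_loss convention: scale * sum(w**2) / 2."""
    return scale * 0.5 * np.sum(weights ** 2)

# Two hypothetical weight tensors collected via REGULARIZATION_LOSSES
w1 = np.array([[1.0, -2.0], [3.0, 0.0]])
w2 = np.array([0.5, -0.5])

# apply_regularization sums the per-variable penalties
reg_term = l2_penalty(w1) + l2_penalty(w2)
print(reg_term)  # 0.1 * 0.5 * (14.0 + 0.5) = 0.725
```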

Now, what would this look like with tf.layers.conv2d?

Thanks!

Answer

You can pass them into tf.layers.conv2d as arguments:

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
layer2 = tf.layers.conv2d(
    inputs,
    filters,
    kernel_size,
    kernel_regularizer=regularizer)

Then you should add the regularization loss to your loss like this:

l2_loss = tf.losses.get_regularization_loss()
loss += l2_loss
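To see what `loss += l2_loss` does to training: the gradient of the L2 term with respect to each weight is `scale * w`, i.e. weight decay that pulls every parameter toward zero on each optimizer step. A hedged NumPy sketch (illustrative weight and learning-rate values, assuming the half-sum-of-squares convention above):

```python
import numpy as np

scale = 0.1
w = np.array([2.0, -1.0, 0.5])

# L2 term in the tf.nn.l2_loss convention
l2_loss = scale * 0.5 * np.sum(w ** 2)

# Its gradient w.r.t. w: d/dw [scale/2 * sum(w^2)] = scale * w
grad_l2 = scale * w

# One SGD step with learning rate 0.5 shrinks every weight toward zero
w_new = w - 0.5 * grad_l2
print(w_new)  # [1.9, -0.95, 0.475]
```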

Thanks to Zeke Arneodo, Tom and srcolinas, I added the last bit of your feedback so that the accepted answer provides a complete solution.
