Question
I'm rewriting tf.contrib.slim.nets.inception_v3 using tf.layers. Unfortunately the new tf.layers module does not work with arg_scope, as it does not have the necessary decorators. Is there a better mechanism in place that I should use to set default parameters for layers? Or should I simply add the proper arguments to each layer and remove the arg_scope?
Here is an example that uses arg_scope:
with variable_scope.variable_scope(scope, 'InceptionV3', [inputs]):
    with arg_scope(
        [layers.conv2d, layers_lib.max_pool2d, layers_lib.avg_pool2d],
        stride=1,
        padding='VALID'):
Answer
There isn't another mechanism that lets you define default values in core TensorFlow, so you should specify the arguments for each layer.
For example, this code:
with slim.arg_scope([slim.fully_connected],
                    activation_fn=tf.nn.relu,
                    weights_initializer=tf.truncated_normal_initializer(stddev=0.01),
                    weights_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005)):
    x = slim.fully_connected(x, 800)
    x = slim.fully_connected(x, 1000)
would become:
x = tf.layers.dense(x, 800, activation=tf.nn.relu,
                    kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
                    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
x = tf.layers.dense(x, 1000, activation=tf.nn.relu,
                    kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
                    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
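The original answer doesn't mention this, but if repeating the same keyword arguments for every layer becomes tedious, plain Python's functools.partial can bundle the shared defaults. A minimal sketch, assuming TensorFlow 1.x and a hypothetical placeholder input:

import functools
import tensorflow as tf

# Bundle the defaults once; they are still passed explicitly on every call.
dense_relu = functools.partial(
    tf.layers.dense,
    activation=tf.nn.relu,
    kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))

x = tf.placeholder(tf.float32, [None, 784])  # hypothetical input
x = dense_relu(x, 800)
x = dense_relu(x, 1000)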
Alternatively, you can set a default initializer through a variable scope:
with tf.variable_scope('fc',
                       initializer=tf.truncated_normal_initializer(stddev=0.01)):
    x = tf.layers.dense(x, 800, activation=tf.nn.relu,
                        kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
    x = tf.layers.dense(x, 1000, activation=tf.nn.relu,
                        kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
Make sure to read the documentation of the layer to see which initializers default to the variable scope initializer. For example, the dense layer's kernel_initializer uses the variable scope initializer, while the bias_initializer uses tf.zeros_initializer().
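A minimal sketch of that behaviour, assuming TensorFlow 1.x and a hypothetical placeholder input: inside the scope, the kernel falls back to the scope's initializer, while the bias keeps the layer's own tf.zeros_initializer() default.

import tensorflow as tf

inputs = tf.placeholder(tf.float32, [None, 128])  # hypothetical input
with tf.variable_scope('fc',
                       initializer=tf.truncated_normal_initializer(stddev=0.01)):
    # kernel_initializer not given -> inherits the scope's initializer
    # bias_initializer not given   -> stays tf.zeros_initializer()
    out = tf.layers.dense(inputs, 64, activation=tf.nn.relu)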