Problem description
I see tf.nn.relu documented here: https://www.tensorflow.org/api_docs/python/tf/nn/relu
But then I also see tf.contrib.layers.relu used in the model_fn example on this page: https://www.tensorflow.org/extend/estimators
The latter doesn't seem to be described in an API-like fashion the way the first one is; it is only shown in use.
Why is this? Are the docs out of date? Why are there two - is one of them old, no longer supported, and going to be removed?
Recommended answer
They are not the same thing.

The latter is not an activation function but a fully_connected layer that has its activation function preset to nn.relu:
relu = functools.partial(fully_connected, activation_fn=nn.relu)
# ^                                                     |<   >|
# |_ tf.contrib.layers.relu                 tf.nn.relu ______|
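To make the relationship concrete, here is a minimal sketch (TF 1.x, where tf.contrib still exists; the input shape and layer width are arbitrary illustration values):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 10])

# tf.contrib.layers.relu builds a fully connected layer whose
# activation is already preset to tf.nn.relu ...
h1 = tf.contrib.layers.relu(x, num_outputs=32)

# ... so it is equivalent to spelling out the layer yourself:
h2 = tf.contrib.layers.fully_connected(x, num_outputs=32,
                                       activation_fn=tf.nn.relu)

Both calls create weights and a bias and apply ReLU to the result; they differ only in how the activation is specified.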
If you read the docs for contrib.layers, you'll find:
"Aliases for fully_connected which set a default activation function are available: relu, relu6 and linear."
In summary, tf.contrib.layers.relu is an alias for a fully_connected layer with relu activation, while tf.nn.relu is the REctified Linear Unit activation function itself.
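By contrast, tf.nn.relu on its own is just the element-wise max(x, 0), with no weights attached; a quick check under the TF 1.x session API:

import tensorflow as tf

t = tf.constant([-3.0, -1.0, 0.0, 2.0])
out = tf.nn.relu(t)  # element-wise max(t, 0); creates no variables

with tf.Session() as sess:
    print(sess.run(out))  # [0. 0. 0. 2.]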