Problem description
I'm having trouble recovering a tensor by name; I don't even know whether it's possible.
I have a function that creates my graph:
def create_structure(tf, x, input_size, dropout):
    with tf.variable_scope("scale_1") as scope:
        W_S1_conv1 = deep_dive.weight_variable_scaling([7, 7, 3, 64], name='W_S1_conv1')
        b_S1_conv1 = deep_dive.bias_variable([64])
        S1_conv1 = tf.nn.relu(deep_dive.conv2d(x_image, W_S1_conv1, strides=[1, 2, 2, 1], padding='SAME') + b_S1_conv1, name="Scale1_first_relu")
    ...
    return S3_conv1, regularizer
I want to access the variable S1_conv1 outside this function. I tried:
with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('Scale1_first_relu')
but that gives me an error:
ValueError: Under-sharing: Variable scale_1/Scale1_first_relu does not exist, disallowed. Did you mean to set reuse=None in VarScope?
But this works:
with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('W_S1_conv1')
I could work around the problem with
return S3_conv1, regularizer, S1_conv1
but I don't want to do that.
I think my problem is that S1_conv1 is not really a variable; it's just a tensor. Is there a way to do what I want?
Recommended answer
There is a function tf.Graph.get_tensor_by_name(). For instance:
import tensorflow as tf

c = tf.constant([[1.0, 2.0], [3.0, 4.0]])
d = tf.constant([[1.0, 1.0], [0.0, 1.0]])
e = tf.matmul(c, d, name='example')

with tf.Session() as sess:
    test = sess.run(e)
    print(e.name)  # example:0
    test = tf.get_default_graph().get_tensor_by_name("example:0")
    print(test)    # Tensor("example:0", shape=(2, 2), dtype=float32)