Question
I created two Tensors (namely x1 and y1) initialized with a uniform distribution, but when I print out the results they are not what I expected.
Here is my code:
import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)
y1 = tf.random_uniform([1], 0, 10, tf.int32)
subtraction = x1 - y1
with tf.Session() as sess:
    print(sess.run(x1))
    print(sess.run(y1))
    print(sess.run(subtraction))
The output is:
[6]
[2]
[0]
Answer
In your code, x1 and y1 are random number generators. They take different values each time they are called. So when you call subtraction, which in turn calls your number generators x1 and y1, there is no reason to obtain results that are consistent with the previous calls.
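For instance, here is a minimal sketch (using the same TF1-style graph/session API as the question) showing that fetching the same random op in two separate sess.run calls generally prints two different values:

import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)
with tf.Session() as sess:
    print(sess.run(x1))  # one sample, e.g. [6]
    print(sess.run(x1))  # a fresh sample on this call, usually a different value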
To achieve what you are looking for, store the values in a Variable:
import tensorflow as tf

x1 = tf.Variable(tf.random_uniform([1], 0, 10, tf.int32))
y1 = tf.Variable(tf.random_uniform([1], 0, 10, tf.int32))
subtraction = x1 - y1
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x1))
    print(sess.run(y1))
    print(sess.run(subtraction))
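As a quick check (a sketch under the same TF1 session assumptions), once the variable has been initialized, every read returns the stored sample, so repeated fetches agree with each other and with the subtraction:

import tensorflow as tf

x1 = tf.Variable(tf.random_uniform([1], 0, 10, tf.int32))
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # the value is sampled once here
    print(sess.run(x1))
    print(sess.run(x1))  # same value as the previous line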
Alternatively, if you don't need persistence between iterations and can call all the operators relying on your number generators at once, pack them into the same call to sess.run:
import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)
y1 = tf.random_uniform([1], 0, 10, tf.int32)
subtraction = x1 - y1
with tf.Session() as sess:
    print(sess.run([x1, y1, subtraction]))
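Because all three fetches here share a single graph execution, x1 and y1 are each sampled exactly once for that call, so the printed difference matches the printed operands.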