I'm using Tensorflow==2.0.0a0 and want to run the following script:

import tensorflow as tf
import tensorboard
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import tensorflow_probability as tfp
from tensorflow_model_optimization.sparsity import keras as sparsity
from tensorflow import keras

tfd = tfp.distributions

init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)

    model = tf.keras.Sequential([
      tf.keras.layers.Dense(1,kernel_initializer='glorot_uniform'),
      tfp.layers.DistributionLambda(lambda t: tfd.Normal(loc=t, scale=1))
    ])

All my older notebooks work with TF 1.13. However, I want to develop a notebook that uses model optimization (neural network pruning) together with TF Probability, both of which require Tensorflow > 1.13.

All the libraries import fine, but init = tf.global_variables_initializer() raises an error:
AttributeError: module 'tensorflow' has no attribute 'global_variables_initializer'

Likewise, tf.Session() raises an error:
AttributeError: module 'tensorflow' has no attribute 'Session'

So I suspect this is an issue with Tensorflow itself, but I don't have an earlier version available in my Anaconda environment.

Output of the library versions:
tf.__version__
Out[16]: '2.0.0-alpha0'

tfp.__version__
Out[17]: '0.7.0-dev20190517'

keras.__version__
Out[18]: '2.2.4-tf'

Any ideas on this issue?

Best Answer

Tensorflow 2.0 moved away from sessions in favor of eager execution. You can still run your code with sessions if you reference the TF1 APIs under tf.compat.v1 and disable eager execution:

import tensorflow as tf
import tensorboard
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import tensorflow_probability as tfp
from tensorflow_model_optimization.sparsity import keras as sparsity
from tensorflow import keras


# Revert to TF1-style graph execution; TF2 runs eagerly by default
tf.compat.v1.disable_eager_execution()


tfd = tfp.distributions

# TF1 symbols such as global_variables_initializer now live under tf.compat.v1
init = tf.compat.v1.global_variables_initializer()

with tf.compat.v1.Session() as sess:
    sess.run(init)

    model = tf.keras.Sequential([
      tf.keras.layers.Dense(1,kernel_initializer='glorot_uniform'),
      tfp.layers.DistributionLambda(lambda t: tfd.Normal(loc=t, scale=1))
    ])
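
Note that the session is only needed if you want to keep the TF1 graph-mode workflow. In TF2's default eager mode, Keras models create and initialize their variables automatically on first call, so a minimal sketch of the same model (no Session, no initializer) looks like this:

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Variables are created and initialized automatically when the model is first called
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, kernel_initializer='glorot_uniform'),
    tfp.layers.DistributionLambda(lambda t: tfd.Normal(loc=t, scale=1))
])

x = np.array([[1.0]], dtype=np.float32)
dist = model(x)  # returns a tfd.Normal distribution object
print(dist.mean(), dist.sample())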

You can also convert any Python script to TF2 automatically with the tf_upgrade_v2 tool:
tf_upgrade_v2 --infile in.py --outfile out.py
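
If you have a whole project to migrate, the same tool also accepts a directory tree and can write a report of every change it makes (the paths below are placeholders; see tf_upgrade_v2 --help for the exact flags):

tf_upgrade_v2 --intree my_project/ --outtree my_project_v2/ --reportfile report.txt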

Regarding python - Tensorflow == 2.0.0a0 - AttributeError: module 'tensorflow' has no attribute 'global_variables_initializer', there is a similar question on Stack Overflow: https://stackoverflow.com/questions/56192998/
