I get this error message when using TensorFlow 1.11.0:

[['model', '300000']]
Jan 01 03:24 test.py[line:53] INFO Test model/model.ckpt-300000.
Jan 01 03:24 test.py[line:57] INFO Test data/test.1.txt with beam_size = 1
Jan 01 03:24 data_util.py[line:17] INFO Try load dict from data/doc_dict.txt.
Jan 01 03:24 data_util.py[line:33] INFO Load dict data/doc_dict.txt with 30000 words.
Jan 01 03:24 data_util.py[line:17] INFO Try load dict from data/sum_dict.txt.
Jan 01 03:24 data_util.py[line:33] INFO Load dict data/sum_dict.txt with 30000 words.
Jan 01 03:24 data_util.py[line:172] INFO Load test document from data/test.1.txt.
Jan 01 03:24 data_util.py[line:178] INFO Load 1 testing documents.
Jan 01 03:24 data_util.py[line:183] INFO Doc dict covers 75.61% words.
2019-01-01 03:24:51.426388: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Jan 01 03:24 summarization.py[line:195] INFO Creating 1 layers of 400 units.
Traceback (most recent call last):
  File "src/summarization.py", line 241, in <module>
    tf.app.run()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "src/summarization.py", line 229, in main
    decode()
  File "src/summarization.py", line 196, in decode
    model = create_model(sess, True)
  File "src/summarization.py", line 75, in create_model
    dtype=dtype)
  File "/TensorFlow-Summarization/src/bigru_model.py", line 89, in __init__
    wrapper_state = tf.contrib.seq2seq.DynamicAttentionWrapperState(
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapperState'
Jan 01 03:24 test.py[line:57] INFO Test data/test.1.txt with beam_size = 10
Jan 01 03:25 data_util.py[line:17] INFO Try load dict from data/doc_dict.txt.
Jan 01 03:25 data_util.py[line:33] INFO Load dict data/doc_dict.txt with 30000 words.
Jan 01 03:25 data_util.py[line:17] INFO Try load dict from data/sum_dict.txt.
Jan 01 03:25 data_util.py[line:33] INFO Load dict data/sum_dict.txt with 30000 words.
Jan 01 03:25 data_util.py[line:172] INFO Load test document from data/test.1.txt.
Jan 01 03:25 data_util.py[line:178] INFO Load 1 testing documents.
Jan 01 03:25 data_util.py[line:183] INFO Doc dict covers 75.61% words.
2019-01-01 03:25:02.643185: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Jan 01 03:25 summarization.py[line:195] INFO Creating 1 layers of 400 units.
Traceback (most recent call last):
  File "src/summarization.py", line 241, in <module>
    tf.app.run()
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "src/summarization.py", line 229, in main
    decode()
  File "src/summarization.py", line 196, in decode
    model = create_model(sess, True)
  File "src/summarization.py", line 75, in create_model
    dtype=dtype)
  File "/TensorFlow-Summarization/src/bigru_model.py", line 89, in __init__
    wrapper_state = tf.contrib.seq2seq.DynamicAttentionWrapperState(
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapperState'




# bigru_model.py, line 89 -- the call that raises the AttributeError above
wrapper_state = tf.contrib.seq2seq.DynamicAttentionWrapperState(
    self.init_state, self.prev_att)

Best answer

This is probably because, according to the documentation, tf.contrib.seq2seq in API 1.11.0 no longer contains DynamicAttentionWrapper.

They added monotonic attention wrappers in release 1.3.0.
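If you prefer to update the model code rather than downgrade TensorFlow, a common workaround (not part of the original answer) is to use the renamed classes that later 1.x releases ship instead, tf.contrib.seq2seq.AttentionWrapper and AttentionWrapperState, and to build the initial state from the wrapper's zero state. The sketch below is only illustrative: names such as encoder_outputs, encoder_state and num_units are hypothetical stand-ins for whatever bigru_model.py actually defines, and the real wiring of the model may differ.

import tensorflow as tf  # assumes TensorFlow 1.11.0, where tf.contrib is still available

# Hypothetical placeholders standing in for the model's real tensors.
num_units = 400
encoder_outputs = tf.placeholder(tf.float32, [None, None, num_units])  # [batch, time, units]
encoder_state = tf.placeholder(tf.float32, [None, num_units])          # final GRU state

attention = tf.contrib.seq2seq.BahdanauAttention(
    num_units=num_units, memory=encoder_outputs)

decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.contrib.rnn.GRUCell(num_units), attention_mechanism=attention)

# Instead of constructing DynamicAttentionWrapperState by hand, start from the
# wrapper's zero state and overwrite its RNN part with the encoder's final state.
batch_size = tf.shape(encoder_outputs)[0]
initial_state = decoder_cell.zero_state(batch_size, tf.float32).clone(
    cell_state=encoder_state)

Alternatively, pinning TensorFlow to an early 1.x version that still exposes DynamicAttentionWrapper (the repository appears to target the pre-rename contrib API) avoids touching the model code at all.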

Regarding python - AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapperState', we found a similar question on Stack Overflow: https://stackoverflow.com/questions/53990810/
