Initializing seq2seq embeddings with pretrained word2vec

Problem description

I am interested in initializing the TensorFlow seq2seq implementation with pretrained word2vec embeddings.

I have looked at the code. It seems the embedding is initialized like this:

with tf.variable_scope(scope or "embedding_attention_decoder"):
    with tf.device("/cpu:0"):
        embedding = tf.get_variable("embedding", [num_symbols, cell.input_size])

How do I change this so the embedding is initialized with a pretrained word2vec model?
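One detail worth making explicit before the answer: the pretrained matrix you substitute must have its rows aligned with the seq2seq vocabulary, i.e. row i holds the vector for symbol id i, and symbols missing from word2vec need a fallback. A minimal numpy sketch of building such a matrix (the `w2v` dict and `vocab` list here are made up for illustration):

```python
import numpy as np

# hypothetical inputs: a word -> vector mapping from word2vec,
# and the seq2seq vocabulary (index = symbol id)
embedding_dim = 4
w2v = {"the": np.ones(embedding_dim), "cat": np.full(embedding_dim, 2.0)}
vocab = ["_PAD", "the", "cat", "dog"]  # "_PAD" and "dog" are not in w2v

rng = np.random.RandomState(0)
matrix = np.empty((len(vocab), embedding_dim), dtype=np.float32)
for i, word in enumerate(vocab):
    if word in w2v:
        matrix[i] = w2v[word]  # copy the pretrained vector into row i
    else:
        # random fallback for out-of-vocabulary symbols
        matrix[i] = rng.uniform(-0.1, 0.1, embedding_dim)

print(matrix.shape)  # (4, 4)
```

The resulting `matrix` is what you would feed into the assign trick described in the answer below.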

Recommended answer

I think you've already gotten your answer on the mailing list, but I am putting it here for posterity.

https://groups.google.com/a/tensorflow.org/forum/#!topic/discuss/bH6S98NpIJE

You can initialize it randomly and afterwards do: session.run(embedding.assign(my_word2vec_matrix))

This overwrites the initial values.
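The assign-after-init pattern amounts to: create the variable with random values, then overwrite them in place. The numpy analogue, purely for illustration:

```python
import numpy as np

vocab_size, embedding_dim = 4, 3
rng = np.random.RandomState(42)

# step 1: the variable starts with random values (what the initializer does)
embedding = rng.uniform(-0.1, 0.1, (vocab_size, embedding_dim)).astype(np.float32)

# step 2: the pretrained matrix overwrites those values (what assign does)
my_word2vec_matrix = np.ones((vocab_size, embedding_dim), dtype=np.float32)
embedding[:] = my_word2vec_matrix

print(np.array_equal(embedding, my_word2vec_matrix))  # True
```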

This seems to work for me. I believe trainable=False is needed to keep the values fixed.

import numpy as np
import tensorflow as tf
from gensim.models import KeyedVectors

# load a pretrained word2vec model (here via gensim)
model = KeyedVectors.load_word2vec_format(FILENAME, binary=True)

# embedding matrix (syn0 is the raw vector array; .vectors in newer gensim)
X = model.syn0
print(type(X))  # numpy.ndarray
print(X.shape)  # (vocab_size, embedding_dim)

# start an interactive session
sess = tf.InteractiveSession()

# create the embedding variable with a random initializer;
# trainable=False keeps the pretrained values fixed during training
embeddings = tf.Variable(tf.random_uniform(X.shape, minval=-0.1, maxval=0.1),
                         trainable=False)

# run the initializers
sess.run(tf.initialize_all_variables())

# override the random initial values with the pretrained matrix
sess.run(embeddings.assign(X))
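Once the assign has run, an embedding lookup on this variable (e.g. via tf.nn.embedding_lookup inside the seq2seq graph) is just row indexing into the matrix by token id. The equivalent numpy operation, with a toy matrix for illustration:

```python
import numpy as np

# a toy "pretrained" matrix: 5 symbols, 3-dimensional vectors
X = np.arange(15, dtype=np.float32).reshape(5, 3)

# token ids for one decoder input sequence
ids = np.array([0, 3, 1])

# embedding lookup == selecting rows by id
looked_up = X[ids]
print(looked_up.shape)  # (3, 3)
print(looked_up[1])     # row 3 of X: [ 9. 10. 11.]
```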
