What does the tf.nn.embedding_lookup function do?


Problem description

tf.nn.embedding_lookup(params, ids, partition_strategy='mod', name=None)

I don't understand what this function does. Is it like a lookup table, i.e. does it return the parameters corresponding to each id (in ids)?

For instance, in the skip-gram model, if we use tf.nn.embedding_lookup(embeddings, train_inputs), does it find the corresponding embedding for each train_input?
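A minimal sketch of that call, assuming TensorFlow 2.x with eager execution; the vocabulary size, embedding dimension, and ids are illustrative:

import tensorflow as tf

vocab_size, embed_dim = 1024, 64
embeddings = tf.Variable(tf.random.uniform([vocab_size, embed_dim], -1.0, 1.0))

train_inputs = tf.constant([0, 5, 17, 33])               # word ids in a batch
embedded = tf.nn.embedding_lookup(embeddings, train_inputs)
print(embedded.shape)                                     # (4, 64): one row per id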

Recommended answer

The embedding_lookup function retrieves rows of the params tensor. The behavior is similar to indexing into an array in numpy, e.g.

import numpy as np

matrix = np.random.random([1024, 64])  # 64-dimensional embeddings
ids = np.array([0, 5, 17, 33])
print(matrix[ids])  # prints a matrix of shape [4, 64]

The params argument can also be a list of tensors, in which case the ids will be distributed among the tensors. For example, given a list of 3 tensors of shape [2, 64], the default behavior is that they will represent the ids [0, 3], [1, 4], and [2, 5] respectively.
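A hypothetical numpy emulation of that layout; lookup_mod is an illustrative helper, not a TensorFlow API, and only mimics the default 'mod' behavior described above:

import numpy as np

num_shards = 3
shards = [np.random.random([2, 64]) for _ in range(num_shards)]  # 3 tensors of shape [2, 64]

def lookup_mod(shards, ids):
    # Under the default 'mod' strategy, id i lives in shard i % num_shards,
    # at row i // num_shards within that shard.
    return np.stack([shards[i % num_shards][i // num_shards] for i in ids])

print(lookup_mod(shards, [0, 3, 4]).shape)  # (3, 64): one embedding row per id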

partition_strategy controls how the ids are distributed among the list. The partitioning is useful for larger-scale problems, when the matrix might be too large to keep in one piece.
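As a sketch of the difference between the two documented strategies, assuming the partitioning rules from the TensorFlow documentation ('mod' scatters ids round-robin, 'div' assigns contiguous blocks), here is how 10 ids would be split across 3 shards:

num_ids, num_shards = 10, 3

# 'mod' (the default): id i goes to shard i % num_shards.
mod_shards = [[i for i in range(num_ids) if i % num_shards == s]
              for s in range(num_shards)]

# 'div': contiguous blocks; the first num_ids % num_shards shards hold one extra id.
size, extra = divmod(num_ids, num_shards)
div_shards, start = [], 0
for s in range(num_shards):
    n = size + 1 if s < extra else size
    div_shards.append(list(range(start, start + n)))
    start += n

print(mod_shards)  # [[0, 3, 6, 9], [1, 4, 7], [2, 5, 8]]
print(div_shards)  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]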

