This article explains how to deal with TensorBoard embeddings hanging at "Computing PCA...". It should be a useful reference for anyone who runs into the same problem.

Problem Description

I'm trying to display my embeddings in TensorBoard. When I open the embeddings tab of TensorBoard, I get "Computing PCA..." and TensorBoard hangs indefinitely.

Before that, it does load my tensor of shape 200x128, and it finds the metadata file too.

I tried this on TF versions 0.12 and 1.1 with the same result.

features = np.zeros(shape=(num_batches*batch_size, 128), dtype=float)
embedding_var = tf.Variable(features, name='feature_embedding')
config = projector.ProjectorConfig()
embedding = config.embeddings.add()
embedding.tensor_name = 'feature_embedding'
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
embedding.metadata_path = metadata_path

with tf.Session(config=self.config) as sess:
  tf.global_variables_initializer().run()
  restorer = tf.train.Saver()
  restorer.restore(sess, self.pretrained_model_path)

  with open(metadata_path, 'w') as f:

    for step in range(num_batches):
      batch_images, batch_labels = data.next()

      for label in batch_labels:
        f.write('%s\n' % label)

      feed_dict = {model.images: batch_images}
      features[step*batch_size : (step+1)*batch_size, :] = \
                  sess.run(model.features, feed_dict)

  sess.run(embedding_var.initializer)
  projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)
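As a quick sanity check, it can help to confirm that the log directory actually contains a checkpoint holding the embedding tensor, since the projector reads the tensor values from the latest checkpoint there. A minimal sketch, assuming log_dir points at the same directory as self.log_dir above:

import tensorflow as tf

log_dir = '/path/to/log_dir'  # placeholder: assumed to be the same directory as self.log_dir
ckpt = tf.train.latest_checkpoint(log_dir)
print('latest checkpoint:', ckpt)  # None means no checkpoint was written to log_dir
if ckpt:
  reader = tf.train.NewCheckpointReader(ckpt)
  # the tensor named in ProjectorConfig should be listed here with shape [num_batches*batch_size, 128]
  print(reader.get_variable_to_shape_map().get('feature_embedding'))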

Recommended Answer

I don't know what was wrong in the code above, but I rewrote it in a different way (below), and it works. The difference is when and how embedding_var is initialized.

I also made a gist to copy-paste the code from.

# imports needed to run this snippet (TF 1.x, projector comes from contrib)
import os
import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

# a numpy array for embeddings and a list for labels
features = np.zeros(shape=(num_batches*self.batch_size, 128), dtype=float)
labels   = []


# compute embeddings batch by batch
with tf.Session(config=self.config) as sess:
  tf.global_variables_initializer().run()
  restorer = tf.train.Saver()
  restorer.restore(sess, self.pretrained_model)

  for step in range(num_batches):
    batch_images, batch_labels = data.next()

    labels += batch_labels

    feed_dict = {model.images: batch_images}                     
    features[step*self.batch_size : (step+1)*self.batch_size, :] = \
                sess.run(model.features, feed_dict)


# write labels
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
with open(metadata_path, 'w') as f:
  for label in labels:
    f.write('%s\n' % label)


# write embeddings
with tf.Session(config=self.config) as sess:

  config = projector.ProjectorConfig()
  embedding = config.embeddings.add()
  embedding.tensor_name = 'feature_embedding'
  embedding.metadata_path = metadata_path

  embedding_var = tf.Variable(features, name='feature_embedding')
  sess.run(embedding_var.initializer)
  projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)                  

  saver = tf.train.Saver({"feature_embedding": embedding_var})
  saver.save(sess, os.path.join(self.log_dir, 'model_features'))
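After this runs, log_dir should contain metadata.tsv, projector_config.pbtxt (written by visualize_embeddings), and the model_features checkpoint files (written by saver.save). Starting TensorBoard with --logdir pointed at that directory, for example tensorboard --logdir /path/to/log_dir (the path is a placeholder), and opening the embeddings tab should then load the 200x128 tensor and run PCA without hanging.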

That concludes this article on TensorBoard embeddings hanging at "Computing PCA...". We hope the recommended answer helps you solve the problem.
