Does Google Storage (gs) wrapped file input/output work with Cloud ML?

This article looks at how to handle the question "Does Google Storage (gs) wrapped file input/output work with Cloud ML?", which may serve as a useful reference for anyone facing the same problem.

Question

Google recently announced Cloud ML (https://cloud.google.com/ml/), and it's very useful. However, one limitation is that the input/output of a TensorFlow program should support gs://.

If we use only TensorFlow APIs to read and write files, this should be OK, since these APIs support gs://.
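For instance, here is a minimal sketch of reading through TensorFlow's file_io module (the bucket path is a made-up example); file_io.FileIO accepts both local paths and gs:// URLs:

from tensorflow.python.lib.io import file_io

# The gs:// path below is a placeholder example.
with file_io.FileIO('gs://my-bucket/vocab.pickled', 'r') as f:
    contents = f.read()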

However, if we use native file IO APIs such as open, they do not work, because they don't understand gs://.

For example:

with open(vocab_file, 'wb') as f:
    cPickle.dump(self.words, f)

This code won't work in Google Cloud ML.

However, modifying all native file IO calls to use TensorFlow APIs or the Google Storage Python APIs is really tedious. Is there any simple way to do this? Are there any wrappers that support Google Storage (gs://) on top of the native file IO?

As suggested in Pickled scipy sparse matrix as input data?, perhaps we can use file_io.read_file_to_string('gs://...'), but this still requires significant code modification.
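For reference, a minimal sketch of that approach (the bucket path is a made-up example; write_string_to_file is the counterpart for output):

import cPickle

from tensorflow.python.lib.io import file_io

# Read and unpickle directly from GCS.
words = cPickle.loads(file_io.read_file_to_string('gs://my-bucket/vocab.pickled'))

# The write direction goes through write_string_to_file.
file_io.write_string_to_file('gs://my-bucket/vocab.pickled',
                             cPickle.dumps(words))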

Answer

One solution is to copy all of the data to local disk when the program starts up. You can do that using gsutil inside the Python script that gets run, something like:

import cPickle
import os
import subprocess

vocab_file = 'vocab.pickled'
# Download the pickled vocabulary from GCS to local disk at startup.
subprocess.check_call(['gsutil', '-m', 'cp', '-r',
                       os.path.join('gs://path/to/', vocab_file), '/tmp'])

# Native file IO works once the file is on local disk.
with open(os.path.join('/tmp', vocab_file), 'rb') as f:
    self.words = cPickle.load(f)

And if you have any outputs, you can write them to local disk and gsutil rsync them. (But be careful to handle restarts correctly, because you may be put on a different machine.)
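A minimal sketch of that output path (the local directory and bucket path are made-up examples):

import os
import subprocess

output_dir = '/tmp/outputs'
if not os.path.exists(output_dir):
    os.makedirs(output_dir)

# ... training writes its output files under output_dir ...

# Mirror the local directory up to GCS; the bucket path is a placeholder.
subprocess.check_call(['gsutil', '-m', 'rsync', '-r',
                       output_dir, 'gs://path/to/outputs'])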

The other solution is to monkey patch open (note: untested):

import __builtin__

from tensorflow.python.lib.io import file_io

# NB: not all modes are compatible; should handle more carefully.
# Probably should be reported on
# https://github.com/tensorflow/tensorflow/issues/4357
def new_open(name, mode='r', buffering=-1):
  return file_io.FileIO(name, mode)

# Route every subsequent open() call through FileIO, which understands
# both local paths and gs:// URLs.
__builtin__.open = new_open

Just be sure to do that before any module actually tries to read from GCS.
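For example, in a hypothetical entry-point script (the module names here are assumptions, not part of the original answer):

# train.py: apply the patch before anything touches GCS.
import gcs_open_patch   # hypothetical module containing the monkey patch above

import model            # hypothetical module that calls open() internally

# open() now resolves to new_open, so gs:// paths work here.
model.load_vocab('gs://my-bucket/vocab.pickled')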
