Question
I have an EC2 instance and an S3 bucket in the same region. The bucket contains reasonably large (5-20 MB) files that are used regularly by my EC2 instance.
I want to programmatically open the file on my EC2 instance (using Python), like so:
file_from_s3 = open('http://s3.amazonaws.com/my-bucket-name/my-file-name')
But using an "http" URL to access the file remotely seems grossly inefficient; surely this would mean downloading the file to the server every time I want to use it.
What I want to know is: is there a way to access S3 files locally from my EC2 instance? For example:
file_from_s3 = open('s3://my-bucket-name/my-file-name')
I can't find a solution myself; any help would be appreciated. Thank you.
Recommended answer
Whatever you do, the object will be downloaded behind the scenes from S3 to your EC2 instance. That cannot be avoided.
If you want to treat files in the bucket as local files, you need to install one of the several S3 filesystem plugins for FUSE (for example, s3fs-fuse). Alternatively, you can use boto for easy access to S3 objects from Python code.
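Below is a minimal sketch of the boto approach using boto3 (the current generation of the boto library). The bucket name, object key, and local path are placeholders taken from the question or invented for illustration, and credentials are assumed to come from the instance's IAM role or standard AWS configuration.

import boto3

# Assumes credentials come from the EC2 instance's IAM role or the usual
# AWS configuration; bucket name, key, and local path are placeholders.
s3 = boto3.client('s3')

# Option 1: download the object to a local file, then open it as usual.
s3.download_file('my-bucket-name', 'my-file-name', '/tmp/my-file-name')
with open('/tmp/my-file-name', 'rb') as file_from_s3:
    data = file_from_s3.read()

# Option 2: read the object body straight into memory
# (reasonable for 5-20 MB files).
response = s3.get_object(Bucket='my-bucket-name', Key='my-file-name')
data = response['Body'].read()

Either way the bytes still travel from S3 to the instance, as noted above; boto just handles authentication against a private bucket for you. With an s3fs-fuse mount, the same object would instead appear under the mount point and could be opened like any local path.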