Problem Description
I have a 27GB file that I am trying to move from an AWS EC2 Linux instance to S3. I've tried both the 's3put' command and the 's3cmd put' command. Both work with a test file; neither works with the large file. No errors are given: the command returns immediately, but nothing happens.
s3cmd put bigfile.tsv s3://bucket/bigfile.tsv
Recommended Answer

Though you can upload objects to S3 with sizes up to 5TB, S3 has a size limit of 5GB for an individual PUT operation.

To upload files larger than 5GB (or even files larger than 100MB), you will want to use S3's multipart upload feature. (Ignore any older documentation describing a 5GB object limit; the current limit is 5TB.)

The boto library for Python supports multipart upload, and recent boto releases include an "s3multiput" command line tool that takes care of the complexity for you and even uploads parts in parallel.
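For example, here is a minimal sketch of a multipart upload using the classic boto API. It assumes the bucket name 'bucket' and the key 'bigfile.tsv' from the question, credentials available in the environment or boto config, and an arbitrarily chosen 50MB part size:

import math
import os

from boto.s3.connection import S3Connection

conn = S3Connection()                      # picks up AWS credentials from env vars / boto config
bucket = conn.get_bucket('bucket')         # bucket name taken from the question

source_path = 'bigfile.tsv'
source_size = os.stat(source_path).st_size

# Start the multipart upload and send the file in 50MB parts.
mp = bucket.initiate_multipart_upload('bigfile.tsv')
part_size = 50 * 1024 * 1024
part_count = int(math.ceil(source_size / float(part_size)))

with open(source_path, 'rb') as fp:
    for i in range(part_count):
        offset = part_size * i
        fp.seek(offset)
        mp.upload_part_from_file(fp, part_num=i + 1,
                                  size=min(part_size, source_size - offset))

mp.complete_upload()                       # S3 assembles the parts into the final 27GB object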