This article explains how to deal with s3cmd's "failed too many times" error. It should be a useful reference for anyone hitting the same problem.

Problem description

I used to be a happy s3cmd user. However, recently when I try to transfer a large compressed file (~7 GB) to Amazon S3, I get this error:

$> s3cmd put thefile.tgz s3://thebucket/thefile.tgz

....
  20480 of 7563176329     0% in    1s    14.97 kB/s  failed
WARNING: Upload failed: /thefile.tgz ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=1.25)
WARNING: Waiting 15 sec...
thefile.tgz -> s3://thebucket/thefile.tgz  [1 of 1]
       8192 of 7563176329     0% in    1s     5.57 kB/s  failed
ERROR: Upload of 'thefile.tgz' failed too many times. Skipping that file.

I am using the latest s3cmd on Ubuntu.

Why is this happening, and how can I solve it? If it is unresolvable, what alternative tool can I use?

Recommended answer

In my case, the failure was caused by my server's clock being ahead of S3's time: my server (located in US East) was set to GMT+4 while I was using Amazon's US East storage facility.

After adjusting my server's clock to US East time, the problem was gone.
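This matches a known S3 behavior: requests whose timestamp is too far off from AWS's clock (roughly 15 minutes) are rejected, which s3cmd surfaces as a broken-pipe upload failure. A minimal sketch of how you might compute such a skew in seconds with GNU `date` (the two timestamps below are illustrative values; in practice you would compare your server's clock against the `Date` header of any S3 response):

```shell
# Illustrative timestamps: a server clock 15 minutes ahead of S3's reported time.
server_time="Tue, 28 Aug 2012 13:03:00 GMT"
s3_time="Tue, 28 Aug 2012 12:48:00 GMT"

# Convert both to Unix epoch seconds and subtract (requires GNU date's -d flag).
skew=$(( $(date -u -d "$server_time" +%s) - $(date -u -d "$s3_time" +%s) ))
echo "clock skew: ${skew}s"
```

If the skew is large, syncing the system clock via NTP (e.g. `ntpdate` or `timedatectl set-ntp true`, depending on your setup) should make the uploads succeed again.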

That concludes this article on s3cmd's "failed too many times" error. We hope the recommended answer is helpful.
