This article describes how to handle upload timeouts with Amazon S3. It should be a useful reference for anyone running into the same problem.

Problem description


I have an 800 KB JPG file. I try to upload it to S3 and keep getting a timeout error. Can you please figure out what is wrong? 800 KB is rather small for an upload.

Long contentLength = null;
System.out.println("Uploading a new object to S3 from a file\n");
try {
    byte[] contentBytes = IOUtils.toByteArray(is);
    contentLength = Long.valueOf(contentBytes.length);
} catch (IOException e) {
    System.err.printf("Failed while reading bytes: %s%n", e.getMessage());
}

ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(contentLength);

s3.putObject(new PutObjectRequest(bucketName, key, is, metadata));

Solution

Is it possible that IOUtils.toByteArray is draining your input stream so that there is no more data to be read from it when the service call is made? In that case a stream.reset() would fix the issue.
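The drained-stream failure mode can be reproduced with plain java.io, without the AWS SDK. The sketch below (class and payload names are made up for illustration) reads a ByteArrayInputStream to completion the way IOUtils.toByteArray would, then rewinds it with reset():

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamResetDemo {

    // Reads the stream to the end, the way IOUtils.toByteArray does.
    public static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = "pretend this is the JPG data".getBytes("UTF-8");
        InputStream is = new ByteArrayInputStream(payload);

        // Measuring the length consumes the stream...
        long contentLength = drain(is).length;
        System.out.println("contentLength = " + contentLength);

        // ...so nothing is left for the service call to send.
        System.out.println("remaining after drain = " + is.available());

        // ByteArrayInputStream supports reset() out of the box; for other
        // streams, call mark() first and check markSupported(). In particular,
        // FileInputStream does NOT support reset().
        is.reset();
        System.out.println("remaining after reset = " + is.available());
    }
}
```

If the stream does not support reset (or mark was never set), the only safe options are to reopen the source or buffer the bytes and upload from a new ByteArrayInputStream.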

But if you're just uploading a file (as opposed to an arbitrary InputStream), you can use the simpler form of AmazonS3.putObject() that takes a File, and then you won't need to compute the content length at all.

http://docs.amazonwebservices.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/AmazonS3.html#putObject(java.lang.String, java.lang.String, java.io.File)
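A minimal sketch of the File-based overload, assuming the AWS SDK for Java (v1) is on the classpath and credentials are available from the default provider chain; the bucket and key names are placeholders:

```java
import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;

public class FileUploadSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = new AmazonS3Client();

        // The File overload streams the file and derives Content-Length
        // itself, so there is no need to pre-read the data into memory
        // or build an ObjectMetadata by hand.
        File file = new File("photo.jpg");                 // placeholder path
        s3.putObject("my-bucket", "photo.jpg", file);      // placeholder names
    }
}
```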

This will automatically retry any such network errors several times. You can tweak how many retries the client uses by instantiating it with a ClientConfiguration object.

http://docs.amazonwebservices.com/AWSJavaSDK/latest/javadoc/com/amazonaws/ClientConfiguration.html#setMaxErrorRetry(int)
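A sketch of tuning the retry count, again assuming the v1 SDK; the values shown are illustrative, not recommendations:

```java
import com.amazonaws.ClientConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;

public class RetryConfigSketch {
    public static void main(String[] args) {
        ClientConfiguration config = new ClientConfiguration();
        config.setMaxErrorRetry(10);          // retry retryable errors up to 10 times
        config.setSocketTimeout(120 * 1000);  // optionally widen the socket timeout (ms)

        AmazonS3 s3 = new AmazonS3Client(config);
    }
}
```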

This concludes the article on Amazon S3 upload timeouts. We hope the answer above is helpful.
