This article describes how to resolve the "Access Denied" error when downloading a file from an Amazon AWS S3 bucket in a Spring Boot application. It should be a useful reference for anyone running into the same problem.

Problem Description

I have an auto-configured AWS Spring Boot application, and I'm trying to set up an endpoint that simply downloads a particular file from a given bucket in Amazon S3. I uploaded a JPEG file into the bucket from my computer using the AWS console; now I'm trying to download that file through my Spring Boot API.

I'm getting the following error:

com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;

I have created a user and a group (the user is in the group) in the AWS console; the user/group has full access permissions on S3 as well as administrator access. I downloaded the access-key/secret-key pair and, for testing purposes, pasted the keys directly into my application.properties file as shown below (the actual keys are not shown here, obviously).

I'm confused as to why I'm still getting access denied. I've been searching and working on this for a while, and I can't seem to find a solution to this issue that is specific to Spring Boot. Any help would be greatly appreciated.

application.properties:

cloud.aws.credentials.accessKey=myaccesskey
cloud.aws.credentials.secretKey=mysecretkey
cloud.aws.credentials.instanceProfile=false
cloud.aws.stack.auto=false

cloud.aws.region.auto=true
cloud.aws.region.static=myregion

SimpleResourceLoadingBean.java:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.io.IOException;
import java.io.InputStream;

@RestController
public class SimpleResourceLoadingBean {

    private static Logger log = LoggerFactory.getLogger(HealthMonitorApplication.class);

    @Autowired
    private ResourceLoader resourceLoader;


    @RequestMapping("/getresource")
    public String resourceLoadingMethod() throws IOException {
        log.info("IN RESOURCE LOADER");

        Resource resource = this.resourceLoader.getResource("s3://s3.amazonaws.com/mybucket/myfile.ext");

        InputStream inputStream = resource.getInputStream();

        return inputStream.toString();
    }
}

pom.xml (just the dependencies that are relevant to the question):

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-aws</artifactId>
            <version>1.1.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-aws-autoconfigure</artifactId>
            <version>1.1.0.RELEASE</version>
        </dependency>


Recommended Answer

Figured out the solution. Besides the application.properties configuration, I had to create a configuration class that gives me access to an AmazonS3Client object when provided with the appropriate credentials. I followed an example on GitHub.

AWSConfiguration.java:

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3Client;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AWSConfiguration {

    @Value("${cloud.aws.credentials.accessKey}")
    private String accessKey;

    @Value("${cloud.aws.credentials.secretKey}")
    private String secretKey;

    @Value("${cloud.aws.region}")
    private String region;

    // Credentials are read from application.properties via the @Value fields above.
    @Bean
    public BasicAWSCredentials basicAWSCredentials() {
        return new BasicAWSCredentials(accessKey, secretKey);
    }

    // This AmazonS3Client bean is what the rest of the application autowires to talk to S3.
    @Bean
    public AmazonS3Client amazonS3Client(AWSCredentials awsCredentials) {
        AmazonS3Client amazonS3Client = new AmazonS3Client(awsCredentials);
        amazonS3Client.setRegion(Region.getRegion(Regions.fromName(region)));
        return amazonS3Client;
    }
}
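
The configuration class above reads cloud.aws.credentials.accessKey, cloud.aws.credentials.secretKey and cloud.aws.region, and the S3Wrapper service further down reads cloud.aws.s3.bucket, so application.properties needs matching entries. Here is a minimal sketch with placeholder values (the property names simply mirror the @Value expressions; the region value must be a valid region name such as us-east-1):

cloud.aws.credentials.accessKey=myaccesskey
cloud.aws.credentials.secretKey=mysecretkey
cloud.aws.region=us-east-1
cloud.aws.s3.bucket=mybucket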

Once this is configured, you can autowire an AmazonS3Client object into your other classes and use the client to make requests to S3. The example uses a wrapper class as a service in order to ease the implementation of additional controller classes.

S3Wrapper.java:

import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.*;
import org.apache.commons.io.IOUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.util.StringUtils;
import org.springframework.web.multipart.MultipartFile;

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.net.URLEncoder;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

@Service
public class S3Wrapper {

    @Autowired
    private AmazonS3Client amazonS3Client;

    @Value("${cloud.aws.s3.bucket}")
    private String bucket;

    private PutObjectResult upload(String filePath, String uploadKey) throws FileNotFoundException {
        return upload(new FileInputStream(filePath), uploadKey);
    }

    private PutObjectResult upload(InputStream inputStream, String uploadKey) {
        PutObjectRequest putObjectRequest = new PutObjectRequest(bucket, uploadKey, inputStream, new ObjectMetadata());

        putObjectRequest.setCannedAcl(CannedAccessControlList.PublicRead);

        PutObjectResult putObjectResult = amazonS3Client.putObject(putObjectRequest);

        IOUtils.closeQuietly(inputStream);

        return putObjectResult;
    }

    public List<PutObjectResult> upload(MultipartFile[] multipartFiles) {
        List<PutObjectResult> putObjectResults = new ArrayList<>();

        Arrays.stream(multipartFiles)
                .filter(multipartFile -> !StringUtils.isEmpty(multipartFile.getOriginalFilename()))
                .forEach(multipartFile -> {
                    try {
                        putObjectResults.add(upload(multipartFile.getInputStream(), multipartFile.getOriginalFilename()));
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                });

        return putObjectResults;
    }

    public ResponseEntity<byte[]> download(String key) throws IOException {
        GetObjectRequest getObjectRequest = new GetObjectRequest(bucket, key);

        S3Object s3Object = amazonS3Client.getObject(getObjectRequest);

        S3ObjectInputStream objectInputStream = s3Object.getObjectContent();

        byte[] bytes = IOUtils.toByteArray(objectInputStream);

        String fileName = URLEncoder.encode(key, "UTF-8").replaceAll("\\+", "%20");

        HttpHeaders httpHeaders = new HttpHeaders();
        httpHeaders.setContentType(MediaType.APPLICATION_OCTET_STREAM);
        httpHeaders.setContentLength(bytes.length);
        httpHeaders.setContentDispositionFormData("attachment", fileName);

        return new ResponseEntity<>(bytes, httpHeaders, HttpStatus.OK);
    }

    public List<S3ObjectSummary> list() {
        ObjectListing objectListing = amazonS3Client.listObjects(new ListObjectsRequest().withBucketName(bucket));

        List<S3ObjectSummary> s3ObjectSummaries = objectListing.getObjectSummaries();

        return s3ObjectSummaries;
    }
}
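
The answer doesn't show a controller that consumes the wrapper, but wiring one up is straightforward. The following is a minimal sketch, not part of the original answer: the class name S3Controller, the /s3/** routes, and the request parameter names are illustrative assumptions.

import com.amazonaws.services.s3.model.PutObjectResult;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.util.List;

@RestController
@RequestMapping("/s3")
public class S3Controller {

    @Autowired
    private S3Wrapper s3Wrapper;

    // Streams the requested object back as an attachment, e.g. GET /s3/download?key=myfile.ext
    @RequestMapping(value = "/download", method = RequestMethod.GET)
    public ResponseEntity<byte[]> download(@RequestParam String key) throws IOException {
        return s3Wrapper.download(key);
    }

    // Uploads one or more multipart files to the configured bucket.
    @RequestMapping(value = "/upload", method = RequestMethod.POST)
    public List<PutObjectResult> upload(@RequestParam("file") MultipartFile[] files) {
        return s3Wrapper.upload(files);
    }

    // Lists the objects currently in the bucket.
    @RequestMapping(value = "/list", method = RequestMethod.GET)
    public List<S3ObjectSummary> list() {
        return s3Wrapper.list();
    }
}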

Note: the following dependency needs to be added to pom.xml in order to use the Apache Commons IO library.

pom.xml:

<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-io</artifactId>
    <version>1.3.2</version>
</dependency>

This concludes the article on the "Access Denied" error when downloading files from an Amazon AWS S3 bucket with Spring Boot; hopefully the recommended answer above is helpful.
