Tutorial: Spring Boot Integration with AWS S3

Amazon S3 (Simple Storage Service) is one of the most popular object storage services offered by AWS. It is widely used for storing images, videos, and documents. Integrating AWS S3 into a Spring Boot application allows developers to efficiently perform CRUD (Create, Read, Update, Delete) operations. This article provides a step-by-step guide on how to build a Spring Boot application that integrates AWS S3 for managing files.

Prerequisites

Before we dive into the implementation, ensure you have the following in place:

  1. AWS Account: Create an AWS account and set up an S3 bucket.
  2. AWS Credentials: Generate access and secret keys for your AWS user.
  3. Spring Boot Setup: A basic Spring Boot project configured with Maven or Gradle.
  4. IDE: An Integrated Development Environment (e.g., IntelliJ IDEA or Eclipse).
  5. Java: Java 11 or higher installed on your system.

Create Buckets

1. Go to the AWS S3 console.

https://ap-southeast-1.console.aws.amazon.com/console/home?region=ap-southeast-1#

2. Click the Create bucket button.

3. Enter a bucket name and click the Create bucket button.

4. AWS S3 now shows the new bucket in the bucket list.

Create IAM (Identity and Access Management)

1. Log in to the AWS Management Console

1. Navigate to the AWS Management Console.

https://ap-southeast-1.console.aws.amazon.com/singlesignon/home?region=ap-southeast-1#!/

2. Go to the IAM (Identity and Access Management) service.

2. Create or Select an IAM User

1. For a New IAM User:

  • Enable IAM Identity Center.
  • In the IAM Console, click Users > Add Users.
  • Provide a username (e.g., s3-access-user).
  • Select Programmatic Access to enable the creation of an access key and secret key.

2. For an Existing IAM User:

  • Select an existing user from the Users list.

3. Attach Policies or Permissions

1. Assign appropriate permissions to the user for S3.

  • Use a managed policy like AmazonS3FullAccess (for complete S3 access).
  • Alternatively, use AmazonS3ReadOnlyAccess if only read access is needed.
  • For fine-grained control, create a custom policy.

Example Policy for Full Access to a Specific Bucket:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}

4. Generate the Access Key and Secret Key

1. Go to the Security Credentials tab for the user.

2. Click Create Access Key.

  • AWS will generate an Access Key ID and Secret Access Key.
  • Download the credentials or copy them securely. You won’t be able to view the Secret Access Key again.

5. Secure the Credentials

1. Do not share the credentials.

2. Store the credentials in a secure location, such as:

  • AWS CLI configuration.
  • Environment variables (see the sketch below).
  • Secrets Manager or Vault services.
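
For example, if the keys are exported as the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, the SDK's default provider chain picks them up automatically and no secrets need to appear in application.yml. A minimal sketch (an alternative to the explicit-key configuration in Step 3 below; use one or the other, and note the class name AwsS3EnvConfig is our own):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

@Configuration
public class AwsS3EnvConfig {

    @Bean
    public S3Client s3Client() {
        // DefaultCredentialsProvider checks Java system properties, the
        // AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables,
        // the ~/.aws/credentials file, and instance/container roles, in order.
        return S3Client.builder()
                .region(Region.US_EAST_1)
                .credentialsProvider(DefaultCredentialsProvider.create())
                .build();
    }
}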

Overview of MSP360 and S3 Browser for Managing AWS S3

MSP360 (formerly CloudBerry Explorer) and S3 Browser are graphical user interface (GUI) tools that simplify managing Amazon S3 buckets and objects. These tools cater to users who prefer a GUI over command-line tools like the AWS CLI or SDKs for tasks like uploading, downloading, and organizing files in S3.

MSP360 (CloudBerry Explorer)

Key Features

  • User-Friendly Interface: Dual-pane layout for local files and S3 storage, making file transfers intuitive.
  • Bucket Management: Create, delete, and manage S3 buckets easily.
  • Support for Multiple Cloud Providers: Beyond S3, supports Azure Blob, Google Cloud, and others.
  • Encryption and Compression: Secure your data with client-side encryption and compression before uploading.
  • IAM Role Support: Allows integration with AWS IAM roles for better security and access control.
  • Advanced Search: Quickly find files in large buckets with robust filtering options.
  • Multipart Uploads: Optimize uploads for large files by breaking them into smaller parts.
  • Cost Estimator: Helps estimate AWS S3 storage and transfer costs.

S3 Browser

Key Features

  • Simple and Lightweight: Focused specifically on managing AWS S3 storage.
  • Batch Operations: Supports bulk uploads, downloads, and deletions.
  • Bucket Management: Create and configure buckets, including versioning and lifecycle rules.
  • IAM Role Support: Allows the use of roles for secure access.
  • File Sharing: Generate pre-signed URLs for temporary access to files (see the SDK sketch after this list).
  • Cross-Bucket Copying: Easily move or copy files between buckets.
  • Metadata Management: Edit object metadata like Content-Type, Cache-Control, etc.
  • Access Logs: View bucket usage and access logs.
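
Pre-signed URLs are not exclusive to these GUI tools; the AWS SDK for Java v2 can generate them too. A minimal sketch using S3Presigner (the bucket and key are placeholders, and the presigner here relies on the default credential and region chain):

import java.time.Duration;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.presigner.S3Presigner;
import software.amazon.awssdk.services.s3.presigner.model.GetObjectPresignRequest;

public class PresignExample {

    public static void main(String[] args) {
        try (S3Presigner presigner = S3Presigner.create()) {
            GetObjectRequest getObjectRequest = GetObjectRequest.builder()
                    .bucket("your-bucket-name")
                    .key("example.txt")
                    .build();

            // The resulting URL grants temporary read access and
            // expires after 15 minutes.
            GetObjectPresignRequest presignRequest = GetObjectPresignRequest.builder()
                    .signatureDuration(Duration.ofMinutes(15))
                    .getObjectRequest(getObjectRequest)
                    .build();

            System.out.println(presigner.presignGetObject(presignRequest).url());
        }
    }
}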

Connect AWS S3 with MSP360 (CloudBerry Explorer)

1. Click File and choose Amazon S3

2. Add a new Amazon S3 Account

3. Select the Amazon S3 account.

4. MSP360 connects to AWS S3 successfully.

Step-by-Step Guide

Step 1: Add Dependencies

To interact with AWS S3, we must include the AWS SDK in our project. Add the following dependency to your pom.xml file:

<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <version>2.29.17</version> <!-- Use the latest version -->
</dependency>
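
If your project uses Gradle rather than Maven, the equivalent declaration (Groovy DSL) should look like this:

implementation 'software.amazon.awssdk:s3:2.29.17' // Use the latest version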

Step 2: Configure AWS Credentials

Create an application.yml file (or application.properties) to store your AWS configuration.

aws:
  s3:
    bucket-name: your-s3-bucket-name
  region: us-east-1
  access-key: your-access-key
  secret-key: your-secret-key

Ensure that the credentials and bucket name match your AWS configuration.

Step 3: Set Up AWS S3 Client

The AWS SDK provides an S3Client class for interacting with S3. Create a configuration class to set it up:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

@Configuration
public class AwsS3Config {

    @Value("${aws.access-key}")
    private String accessKey;

    @Value("${aws.secret-key}")
    private String secretKey;

    @Value("${aws.region}")
    private String region;

    @Bean
    public S3Client s3Client() {
        return S3Client.builder()
                .region(Region.of(region))
                .credentialsProvider(
                        StaticCredentialsProvider.create(
                                AwsBasicCredentials.create(accessKey, secretKey)
                        )
                )
                .build();
    }
}

Step 4: Implement CRUD Operations

1. Create (Upload) Files

Create a service class to handle file uploads.

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

@Service
public class S3Service {

    private final S3Client s3Client;

    @Value("${aws.s3.bucket-name}")
    private String bucketName;

    public S3Service(S3Client s3Client) {
        this.s3Client = s3Client;
    }

    public String uploadFile(MultipartFile file) throws IOException {
        String key = file.getOriginalFilename();

        Path tempFile = Files.createTempFile("upload-", file.getOriginalFilename());
        Files.copy(file.getInputStream(), tempFile, StandardCopyOption.REPLACE_EXISTING);

        PutObjectRequest putObjectRequest = PutObjectRequest.builder()
                .bucket(bucketName)
                .key(key)
                .contentType(file.getContentType())
                .build();

        s3Client.putObject(putObjectRequest, tempFile);

        Files.deleteIfExists(tempFile); // Clean up the temporary file after the upload

        return key;
    }
}
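
If you would rather skip the temporary file, the SDK can stream the multipart payload directly to S3. A sketch of an alternative uploadFile body (RequestBody here is software.amazon.awssdk.core.sync.RequestBody, not the Spring annotation):

import software.amazon.awssdk.core.sync.RequestBody;

public String uploadFile(MultipartFile file) throws IOException {
    String key = file.getOriginalFilename();

    PutObjectRequest putObjectRequest = PutObjectRequest.builder()
            .bucket(bucketName)
            .key(key)
            .contentType(file.getContentType())
            .build();

    // This overload streams the payload; the content length must be
    // supplied up front, which MultipartFile already knows.
    s3Client.putObject(putObjectRequest,
            RequestBody.fromInputStream(file.getInputStream(), file.getSize()));

    return key;
}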

2. Read (Download) Files

Add a method to retrieve files from the S3 bucket.

import software.amazon.awssdk.services.s3.model.GetObjectRequest;

import java.io.InputStream;

public InputStream downloadFile(String key) {
    return s3Client.getObject(
            GetObjectRequest.builder()
                    .bucket(bucketName)
                    .key(key)
                    .build()
    );
}
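
If you need the whole object in memory rather than a stream, the SDK also offers getObjectAsBytes. A minimal sketch (the helper name downloadFileAsBytes is our own):

import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

public byte[] downloadFileAsBytes(String key) {
    // Reads the full object into memory; avoid this for very large files.
    ResponseBytes<GetObjectResponse> objectBytes = s3Client.getObjectAsBytes(
            GetObjectRequest.builder()
                    .bucket(bucketName)
                    .key(key)
                    .build()
    );
    return objectBytes.asByteArray();
}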

3. Update Files

Updating files in S3 typically involves overwriting an existing object with a new one, since S3 objects are immutable. You can reuse the uploadFile method for this purpose.
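
If you want to distinguish a create from an update, you can first check whether the key already exists. A sketch using headObject (the helper name fileExists is our own; recent SDK versions surface a missing key as NoSuchKeyException):

import software.amazon.awssdk.services.s3.model.HeadObjectRequest;
import software.amazon.awssdk.services.s3.model.NoSuchKeyException;

public boolean fileExists(String key) {
    try {
        // headObject fetches only metadata, so the check is cheap.
        s3Client.headObject(HeadObjectRequest.builder()
                .bucket(bucketName)
                .key(key)
                .build());
        return true;
    } catch (NoSuchKeyException e) {
        return false;
    }
}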

4. Delete Files

Add a method to delete files from the S3 bucket.

import software.amazon.awssdk.services.s3.model.DeleteObjectRequest;

public void deleteFile(String key) {
    s3Client.deleteObject(
            DeleteObjectRequest.builder()
                    .bucket(bucketName)
                    .key(key)
                    .build()
    );
}

Step 5: Build REST API

Create a controller to expose the CRUD operations as REST endpoints.

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.io.InputStream;

@RestController
@RequestMapping("/api/files")
public class S3Controller {

    private final S3Service s3Service;

    public S3Controller(S3Service s3Service) {
        this.s3Service = s3Service;
    }

    @PostMapping("/upload")
    public ResponseEntity<String> uploadFile(@RequestParam("file") MultipartFile file) throws IOException {
        String key = s3Service.uploadFile(file);
        return ResponseEntity.ok("File uploaded successfully with key: " + key);
    }

    @GetMapping("/download/{key}")
    public ResponseEntity<byte[]> downloadFile(@PathVariable String key) {
        try (InputStream inputStream = s3Service.downloadFile(key)) { // Auto-closes InputStream
            byte[] content = inputStream.readAllBytes();
            return ResponseEntity.ok(content);
        } catch (IOException e) {
            return ResponseEntity.status(500).body(null); // Handle errors appropriately
        }
    }

    @DeleteMapping("/delete/{key}")
    public ResponseEntity<String> deleteFile(@PathVariable String key) {
        s3Service.deleteFile(key);
        return ResponseEntity.ok("File deleted successfully with key: " + key);
    }
}

Step 6: Test the Application

Postman is a popular tool for testing and debugging RESTful APIs, and we will use it to exercise each endpoint.
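
Alongside the manual Postman calls below, you can also automate a check. A minimal sketch of a MockMvc slice test for the upload endpoint (S3Service is mocked, so no real bucket is touched):

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.when;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.multipart;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.mock.web.MockMultipartFile;
import org.springframework.test.web.servlet.MockMvc;

@WebMvcTest(S3Controller.class)
class S3ControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private S3Service s3Service;

    @Test
    void uploadReturnsOk() throws Exception {
        when(s3Service.uploadFile(any())).thenReturn("example.txt");

        MockMultipartFile file = new MockMultipartFile(
                "file", "example.txt", "text/plain", "hello".getBytes());

        mockMvc.perform(multipart("/api/files/upload").file(file))
                .andExpect(status().isOk());
    }
}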

1. Upload a File

Use a tool like Postman to send a POST request to /api/files/upload with a file.

Verify that the uploaded file (example.txt) appears in the AWS S3 bucket.

2. Download a File

Send a GET request to /api/files/download/{key} with the file’s key.

3. Delete a File

Send a DELETE request to /api/files/delete/{key}.

Refresh the bucket list and confirm that example.txt has been deleted from the AWS S3 bucket.

Step 7: Handle Exceptions

Add exception handling to your service and controller for better error messages. For example:

import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;

@ResponseStatus(HttpStatus.NOT_FOUND)
public class FileNotFoundException extends RuntimeException {
    public FileNotFoundException(String message) {
        super(message);
    }
}
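
A matching @RestControllerAdvice can translate the SDK's NoSuchKeyException into a clean 404 as well; a minimal sketch (the class name and message are our own choices):

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;
import software.amazon.awssdk.services.s3.model.NoSuchKeyException;

@RestControllerAdvice
public class GlobalExceptionHandler {

    // Triggered when S3 reports that the requested key does not exist.
    @ExceptionHandler(NoSuchKeyException.class)
    public ResponseEntity<String> handleNoSuchKey(NoSuchKeyException e) {
        return ResponseEntity.status(HttpStatus.NOT_FOUND)
                .body("File not found: " + e.getMessage());
    }
}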

Conclusion

This tutorial demonstrated how to integrate AWS S3 into a Spring Boot application for performing CRUD operations. By following the steps outlined, you can upload, download, update, and delete files efficiently using the AWS SDK.

For further enhancements, consider:

  • Implementing caching for frequently accessed files.
  • Using AWS S3 features like versioning and lifecycle policies.
  • Securing endpoints with Spring Security.

With this knowledge, you can handle object storage in modern cloud-based applications!

This article was originally published on Medium.
