1. Introduction
In this article, we’ll learn how to mock Amazon S3 (Simple Storage Service) to run integration tests for Java applications.
To demonstrate how it works, we’ll create a CRUD (create, read, update, delete) service that uses the AWS SDK to interact with S3. Then, we’ll write integration tests for each operation using a mocked S3 service.
2. S3 Overview
Amazon Simple Storage Service (S3) is a highly scalable and secure cloud storage service provided by Amazon Web Services (AWS). It uses an object storage model, allowing users to store and retrieve data from anywhere on the web.
The service is accessible through a REST-style API, and AWS provides an SDK for Java applications to perform actions like creating, listing, and deleting S3 buckets and objects.
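To make the object-storage model concrete, here is a minimal in-memory sketch of the idea, not part of the AWS SDK: buckets are named containers that map object keys to raw bytes, which is exactly the shape of the CRUD operations we’ll implement against S3:

```java
import java.util.*;

// Illustrative in-memory model of S3-style object storage:
// a bucket is a named map from object keys to raw bytes.
public class InMemoryObjectStore {

    private final Map<String, Map<String, byte[]>> buckets = new HashMap<>();

    void createBucket(String bucket) {
        buckets.putIfAbsent(bucket, new HashMap<>());
    }

    void putObject(String bucket, String key, byte[] content) {
        // create the bucket implicitly if it does not exist yet
        buckets.computeIfAbsent(bucket, b -> new HashMap<>()).put(key, content);
    }

    Optional<byte[]> getObject(String bucket, String key) {
        return Optional.ofNullable(buckets.getOrDefault(bucket, Map.of()).get(key));
    }

    boolean deleteObject(String bucket, String key) {
        Map<String, byte[]> b = buckets.get(bucket);
        return b != null && b.remove(key) != null;
    }
}
```

The real service differs in scale and durability, of course, but the key/value-per-bucket model is the same one the AWS SDK calls expose.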
Next, let’s start creating the Java CRUD service for S3 using the AWS SDK and implement the create, read, update, and delete operations.
3. Demo S3 CRUD Java Service
Before we can start using S3, we need to add the AWS SDK dependency to our project:
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <version>2.20.52</version>
</dependency>
To view the latest version, we can check Maven Central.
Next, we create the S3CrudService class with software.amazon.awssdk.services.s3.S3Client as a dependency:
class S3CrudService {

    private final S3Client s3Client;

    public S3CrudService(S3Client s3Client) {
        this.s3Client = s3Client;
    }

    // ...
}
Now that we’ve created the service, let’s implement the createBucket(), createObject(), getObject(), and deleteObject() operations by using the S3Client API provided by AWS SDK:
void createBucket(String bucketName) {
    var bucketRequest = CreateBucketRequest.builder()
      .bucket(bucketName)
      .build();
    s3Client.createBucket(bucketRequest);
}

void createObject(String bucketName, File inMemoryObject) {
    var putObjectRequest = PutObjectRequest.builder()
      .bucket(bucketName)
      .key(inMemoryObject.getName())
      .build();
    s3Client.putObject(putObjectRequest, RequestBody.fromByteBuffer(inMemoryObject.getContent()));
}

Optional<byte[]> getObject(String bucketName, String objectKey) {
    try {
        var getObjectRequest = GetObjectRequest.builder()
          .bucket(bucketName)
          .key(objectKey)
          .build();
        ResponseBytes<GetObjectResponse> responseBytes = s3Client.getObjectAsBytes(getObjectRequest);
        return Optional.of(responseBytes.asByteArray());
    } catch (S3Exception e) {
        return Optional.empty();
    }
}

boolean deleteObject(String bucketName, String objectKey) {
    try {
        var deleteObjectRequest = DeleteObjectRequest.builder()
          .bucket(bucketName)
          .key(objectKey)
          .build();
        s3Client.deleteObject(deleteObjectRequest);
        return true;
    } catch (S3Exception e) {
        return false;
    }
}
Now that we have the S3 operations created, let’s learn how to implement integration tests using a mocked S3 service.
4. Use S3Mock Library for Integration Testing
For this tutorial, we have chosen to use the S3Mock library provided by Adobe under the open-source Apache License 2.0. S3Mock is a lightweight server that implements the most commonly used operations of the Amazon S3 API. For the supported S3 operations, we can check the dedicated section in the S3Mock repository README file.
The library developers recommend running the S3Mock service in isolation, preferably using the provided Docker container.
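For reference, assuming the default image name and ports documented in the S3Mock repository, the container can also be started manually outside of a test run:

```shell
# Start the S3Mock container, exposing its HTTP (9090) and HTTPS (9191) ports.
# Image name and port numbers are taken from the S3Mock documentation and may
# change between releases.
docker run -p 9090:9090 -p 9191:9191 -t adobe/s3mock
```

In our tests, however, we won’t start the container by hand; Testcontainers will manage its lifecycle for us.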
Following the recommendation, let’s use Docker and Testcontainers to run the S3Mock service for the integration tests.
4.1. Dependencies
Next, let’s add the necessary dependencies to run S3Mock together with Testcontainers:
<dependency>
    <groupId>com.adobe.testing</groupId>
    <artifactId>s3mock</artifactId>
    <version>3.3.0</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.adobe.testing</groupId>
    <artifactId>s3mock-testcontainers</artifactId>
    <version>3.3.0</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.19.4</version>
    <scope>test</scope>
</dependency>
We can check the s3mock, s3mock-testcontainers, and junit-jupiter artifacts on Maven Central to view their latest versions.
4.2. Setup
As a prerequisite, we must have a running Docker environment so that Testcontainers can start.
When we use the @Testcontainers and @Container annotations on the integration test class, the latest Docker image for S3MockContainer is pulled from the registry and started within the local Docker environment:
@Testcontainers
class S3CrudServiceIntegrationTest {

    @Container
    private S3MockContainer s3Mock = new S3MockContainer("latest");
}
Before running the integration test, let’s create an S3Client instance within the @BeforeEach lifecycle method:
@BeforeEach
void setUp() {
    var endpoint = s3Mock.getHttpsEndpoint();
    var serviceConfig = S3Configuration.builder()
      .pathStyleAccessEnabled(true)
      .build();
    var httpClient = UrlConnectionHttpClient.builder()
      .buildWithDefaults(AttributeMap.builder()
        .put(TRUST_ALL_CERTIFICATES, Boolean.TRUE)
        .build());
    s3Client = S3Client.builder()
      .endpointOverride(URI.create(endpoint))
      .serviceConfiguration(serviceConfig)
      .httpClient(httpClient)
      .build();
}
In the setUp() method, we initialized an instance of S3Client using the builder offered by the S3Client interface. Within this initialization, we specified configurations for the following parameters:
- endpointOverride: This parameter points the client at the address of the mocked S3 service instead of the real AWS endpoint.
- pathStyleAccessEnabled: We set this parameter to true in the service configuration, so the client addresses buckets via the URL path rather than via a subdomain.
- TRUST_ALL_CERTIFICATES: Additionally, we configured an httpClient instance with all certificates trusted, indicated by setting TRUST_ALL_CERTIFICATES to true, since the mocked service doesn’t present a CA-signed certificate.
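The path-style setting deserves a quick illustration. The two S3 addressing styles produce differently shaped URLs, and only the path-style form works against a local mock, because the bucket name would otherwise become part of a host name the mock can’t resolve. The endpoint, bucket, region, and key values below are hypothetical examples:

```java
// Contrasts S3's two addressing styles; all concrete values are examples.
public class S3AddressingStyles {

    // path-style: bucket appears in the URL path, so any endpoint works,
    // including a local mock such as S3Mock
    static String pathStyle(String endpoint, String bucket, String key) {
        return endpoint + "/" + bucket + "/" + key;
    }

    // virtual-hosted-style: bucket becomes part of the host name,
    // which only resolves against real AWS DNS
    static String virtualHostedStyle(String bucket, String region, String key) {
        return "https://" + bucket + ".s3." + region + ".amazonaws.com/" + key;
    }

    public static void main(String[] args) {
        System.out.println(pathStyle("https://localhost:9191", "test-bucket", "file.txt"));
        System.out.println(virtualHostedStyle("test-bucket", "eu-central-1", "file.txt"));
    }
}
```

Setting pathStyleAccessEnabled(true) simply forces the client into the first form.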
4.3. Writing Integration Test for the S3CrudService
Now that the infrastructure setup is complete, let’s write some integration tests for the S3CrudService operations.
First, let’s create a bucket and verify its successful creation:
var s3CrudService = new S3CrudService(s3Client);
s3CrudService.createBucket(TEST_BUCKET_NAME);
var createdBucketName = s3Client.listBuckets().buckets().get(0).name();
assertThat(createdBucketName).isEqualTo(TEST_BUCKET_NAME);
After successfully creating the bucket, let’s upload a new object in S3.
To do so, first, we generate an array of bytes using FileGenerator, and then the createObject() method saves it as an object in the already created bucket:
var fileToSave = FileGenerator.generateFiles(1, 100).get(0);
s3CrudService.createObject(TEST_BUCKET_NAME, fileToSave);
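The FileGenerator helper and its File type aren’t shown in this article; a minimal, hypothetical sketch of what they might look like, assuming File simply wraps a name and a ByteBuffer of content, could be:

```java
import java.nio.ByteBuffer;
import java.util.*;

// Hypothetical sketch of the test helpers used above; the article's actual
// implementation may differ.
class File {
    private final String name;
    private final ByteBuffer content;

    File(String name, ByteBuffer content) {
        this.name = name;
        this.content = content;
    }

    String getName() { return name; }
    ByteBuffer getContent() { return content; }
}

class FileGenerator {
    // generates 'count' files, each holding 'size' random bytes
    static List<File> generateFiles(int count, int size) {
        var random = new Random();
        var files = new ArrayList<File>();
        for (int i = 0; i < count; i++) {
            byte[] bytes = new byte[size];
            random.nextBytes(bytes);
            files.add(new File("file-" + i + ".txt", ByteBuffer.wrap(bytes)));
        }
        return files;
    }
}
```

Any helper with the same shape, a name plus in-memory content, would work equally well for these tests.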
Next, let’s call the getObject() method with the file name of the already saved file to confirm that the object was indeed saved in S3:
var savedFileContent = s3CrudService.getObject(TEST_BUCKET_NAME, fileToSave.getName());
assertThat(savedFileContent).isPresent();
assertThat(savedFileContent.get()).isEqualTo(fileToSave.getContent().array());
Finally, let’s test that deleteObject() also works as expected. To begin with, we call the deleteObject() method with the bucket name and the targeted filename. Subsequently, we call getObject() again and check that the result is empty:
s3CrudService.deleteObject(TEST_BUCKET_NAME, fileToSave.getName());
var deletedFileContent = s3CrudService.getObject(TEST_BUCKET_NAME, fileToSave.getName());
assertThat(deletedFileContent).isEmpty();
5. Conclusion
In this tutorial, we learned how to write integration tests that depend on the AWS S3 service by using the S3Mock library to mock a real S3 service.
To demonstrate this, first, we implemented a basic CRUD service that creates, reads, and deletes objects from S3. Then, we implemented the integration tests using the S3Mock library.
As always, the full implementation of this article can be found over on GitHub.