Test AWS cloud stack offline with Arquillian and LocalStack
When you build applications on the AWS cloud stack (DynamoDB, S3, and so on), you need to write tests against these components. The first idea you might have is to keep one environment for production and another for testing, and run your tests against the latter.
This is fine for integration tests, deployment tests, end-to-end tests, or performance tests, but component tests will be faster if you can run the AWS cloud stack locally and offline.
LocalStack provides exactly that: a fully functional local AWS cloud stack, so you can develop and test your cloud applications offline.
LocalStack can be started in several ways, but the easiest one is using its Docker image. If you run the
atlassianlabs/localstack image, you get the stack up and running with the following configuration:
- API Gateway at http://localhost:4567
- Kinesis at http://localhost:4568
- DynamoDB at http://localhost:4569
- DynamoDB Streams at http://localhost:4570
- Elasticsearch at http://localhost:4571
- S3 at http://localhost:4572
- Firehose at http://localhost:4573
- Lambda at http://localhost:4574
- SNS at http://localhost:4575
- SQS at http://localhost:4576
- Redshift at http://localhost:4577
- ES (Elasticsearch Service) at http://localhost:4578
- SES at http://localhost:4579
- Route53 at http://localhost:4580
- CloudFormation at http://localhost:4581
- CloudWatch at http://localhost:4582
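Starting the container by hand might look like the command below. This is a sketch under the assumption that the image exposes the ports listed above; the container name and the single-range port mapping are my choices, not something mandated by LocalStack:

```shell
# Publish the service ports listed above (4567-4582) plus the web UI on 8080.
docker run -d --name localstack \
  -p 4567-4582:4567-4582 \
  -p 8080:8080 \
  atlassianlabs/localstack
```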
So the next question is: how do you automate the whole process of starting the container, running the tests, and finally stopping everything, and make it portable so you don’t need to worry whether you are using Docker on Linux or macOS? The answer is Arquillian Cube.
Arquillian Cube is an Arquillian extension for managing Docker containers from your tests. To use it you need a Docker daemon running on a machine (local or remote), although it will usually be local.
Arquillian Cube offers three different ways to define container(s):
- Defining a docker-compose file.
- Defining a Container Object.
- Using Container Object DSL.
In this example I am going to show you the Container Object DSL approach, but any of the others works as well.
The first thing you need to do is add the Arquillian and Arquillian Cube dependencies to your build tool.
```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.arquillian.cube</groupId>
      <artifactId>arquillian-cube-docker</artifactId>
      <version>1.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.jboss.arquillian</groupId>
      <artifactId>arquillian-bom</artifactId>
      <version>1.1.13.Final</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
<dependencies>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.86</version>
  </dependency>
  <dependency>
    <groupId>org.jboss.arquillian.junit</groupId>
    <artifactId>arquillian-junit-standalone</artifactId>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.arquillian.cube</groupId>
    <artifactId>arquillian-cube-docker</artifactId>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.assertj</groupId>
    <artifactId>assertj-core</artifactId>
    <version>3.6.2</version>
    <scope>test</scope>
  </dependency>
</dependencies>
```
Then you can write the test, which in this case verifies that you can create a bucket and add some content using the S3 instance started in the Docker host:
```java
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.S3Object;

import java.io.ByteArrayInputStream;
import java.util.UUID;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import org.arquillian.cube.docker.impl.client.containerobject.dsl.Container;
import org.arquillian.cube.docker.impl.client.containerobject.dsl.DockerContainer;
import org.jboss.arquillian.junit.Arquillian;
import org.junit.Test;
import org.junit.runner.RunWith;

import static org.assertj.core.api.Assertions.assertThat;

@RunWith(Arquillian.class)
public class S3Test {

    @DockerContainer
    Container localStack = Container.withContainerName("localstack")
        .fromImage("atlassianlabs/localstack:0.5.3.1")
        .withPortBinding(IntStream.rangeClosed(4567, 4578).boxed()
            .collect(Collectors.toList()).toArray(new Integer[0]))
        .withPortBinding(8080)
        .build();

    @Test
    public void should_create_bucket_and_add_content() {
        final AmazonS3Client amazonS3Client = new AmazonS3Client();
        amazonS3Client.setEndpoint("http://" + localStack.getIpAddress() + ":4572/");

        String bucketName = "my-first-s3-bucket-" + UUID.randomUUID();
        String key = "MyObjectKey";

        amazonS3Client.createBucket(bucketName);
        assertThat(amazonS3Client.listBuckets()).hasSize(1);

        amazonS3Client.putObject(bucketName, key, "abcdef");
        final S3Object object = amazonS3Client.getObject(bucketName, key);
        assertThat(object.getObjectContent())
            .hasSameContentAs(new ByteArrayInputStream("abcdef".getBytes()));
    }
}
```
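As an aside, the withPortBinding call above builds its Integer array with a Java stream. That conversion can be checked in isolation; this standalone sketch (independent of Arquillian, class name is mine) uses the same expression as the test:

```java
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PortRangeDemo {
    public static void main(String[] args) {
        // Same expression as in the test: box ints 4567..4578 into an Integer[].
        Integer[] ports = IntStream.rangeClosed(4567, 4578).boxed()
            .collect(Collectors.toList())
            .toArray(new Integer[0]);

        // 12 ports, one per service from API Gateway (4567) to ES (4578).
        System.out.println(ports.length + " ports, from "
            + ports[0] + " to " + ports[ports.length - 1]);
        // prints: 12 ports, from 4567 to 4578
    }
}
```

rangeClosed includes both endpoints, so all twelve service ports from the list above are bound in one call.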
Important things to take into consideration:
- You annotate your test with the Arquillian runner.
- The @DockerContainer annotation marks the attribute that defines the container.
- Container Object DSL is just a DSL for configuring the container you want to use; in this case, the localstack container with the required port bindings.
- The test simply connects to Amazon S3, creates a bucket, and stores some content.
Nothing else is required. When you run this test, Arquillian Cube connects to the installed Docker (Machine) host and starts the localstack container. When it is up and running and the services are able to receive requests, the tests are executed. After that, the container is stopped and destroyed.
TIP1: If you cannot use the Arquillian runner, you can also use a JUnit Class Rule to define the container, as described at http://arquillian.org/arquillian-cube/#_junit_rule
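For reference, with the JUnit rule approach the container definition looks roughly like the sketch below. This is based on the ContainerDslRule shown in the Arquillian Cube documentation linked above, not verified against version 1.6.0, and the class name is mine; check the link for the exact API:

```java
import org.arquillian.cube.docker.junit.rule.ContainerDslRule;
import org.junit.ClassRule;

public class S3RuleTest {

    // Defines and manages the localstack container without the Arquillian runner.
    @ClassRule
    public static ContainerDslRule localStack =
        new ContainerDslRule("atlassianlabs/localstack:0.5.3.1")
            .withPortBinding(4572);

    // Tests then obtain the endpoint from localStack.getIpAddress() as before.
}
```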
TIP2: If you are planning to use LocalStack across the whole organization, I suggest you use the Container Object approach instead of the DSL: you can then pack the LocalStack Container Object into a jar file and import it into every project that needs it. You can read more at http://arquillian.org/arquillian-cube/#_arquillian_cube_and_container_object
So now you can write tests for your application running on the AWS cloud without having to connect to remote hosts, using just your local environment.
We keep learning,
Alex
Reference: Test AWS cloud stack offline with Arquillian and LocalStack from our JCG partner Alex Soto at the One Jar To Rule Them All blog.