This post will be included in a chapter of Building Microservices for the Cloud: A Guide for Busy Java Developers, a book I'm writing. If you enter your email address here, I'll send you the previous chapters as well as each new chapter when it gets published!
This post has been featured on http://baeldung.com/java-web-weekly-137 and https://www.petrikainulainen.net/weekly/java-testing-weekly-32-2016.
Spring Cloud Series
- Developing Microservices using Spring Boot, Jersey, Swagger and Docker
- Integration Testing using Spring Boot, Postgres and Docker (you are here)
- Services registration and discovery using Spring Cloud Netflix Eureka Server and client-side load-balancing using Ribbon and Feign
- Centralized and versioned configuration using Spring Cloud Config Server and Git
- Routing requests and dynamically refreshing routes using Spring Cloud Zuul Server
- Microservices Sidecar pattern implementation using Postgres, Spring Cloud Netflix and Docker
- Implementing Circuit Breaker using Hystrix, Dashboard using Spring Cloud Turbine Server (work in progress)
1. INTEGRATION TESTING USING SPRING BOOT, POSTGRES AND DOCKER
The goal of integration testing is to verify that the interactions between different parts of the system work well.
Consider this simple example:
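A minimal sketch of such a test might look like this. ActorDao, actorDao and the 200-row count come from the verification bullets that follow; the Application class, the Spring Boot 1.x annotations and the findAll() method are assumptions about the demo's setup:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.SpringApplicationConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

// Sketch only: Application and ActorDao are assumed to exist in the demo project
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
public class ActorDaoIT {

    @Autowired
    private ActorDao actorDao;  // resolved from the dependency injection container

    @Test
    public void findAllShouldReturn200Actors() {
        // A single round trip exercises the DB credentials, the Actor column
        // mappings and the expected row count all at once
        assertEquals(200, actorDao.findAll().size());
    }
}
```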
Successful run of this integration test validates that:
- Class attribute actorDao is found in the dependency injection container.
- If there are multiple implementations of ActorDao interface, the dependency injection container is able to sort out which one to use.
- Credentials needed to communicate with the backend database are correct.
- Actor class attributes are correctly mapped to database column names.
- actor table has exactly 200 rows.
This trivial integration test catches possible issues that unit testing wouldn't be able to find. It comes at a cost, though: a backend database needs to be up and running. If the resources used by the integration test also include a message broker or a text-based search engine, instances of those services would need to be running and reachable as well. As can be seen, extra effort is needed to provision and maintain the VMs / servers / … that integration tests interact with.
In this blog entry, a continuation of the Spring Cloud Series, I'll show and explain how to implement integration testing using Spring Boot, Postgres and Docker: pulling Docker images, starting containers, running the DAO-related tests against one or multiple Docker containers, and disposing of them once the tests are complete.
- 1. INTEGRATION TESTING USING SPRING BOOT, POSTGRES AND DOCKER
- 2. REQUIREMENTS
- 3. THE DOCKER IMAGES
- 4. GENERATING JPA ENTITIES FROM DATABASE SCHEMA
- 5. TESTEXECUTIONLISTENER SUPPORTING CODE
- 6. ACTORDAOIT INTEGRATION TEST CLASS
- 7. MAVEN PLUGIN CONFIGURATION
- 8. RUNNING FROM COMMAND LINE
- 9. RUNNING FROM ECLIPSE
- 10. SOURCE CODE
- 11. REFERENCES
2. REQUIREMENTS
- Java 7 or 8.
- Maven 3.2+.
- Familiarity with Spring Framework.
- Docker host.
3. THE DOCKER IMAGES
I'm going to start by building a couple of Docker images: first a base Postgres Docker image, then a DVD rental DB Docker image that extends the base image and that the integration tests will connect to once a container is started.
3.1 BASE POSTGRES DOCKER IMAGE
This image extends the official Postgres image from Docker Hub and attempts to create a database using environment variables passed to the run command.
Here is a snippet from its Dockerfile:
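The original snippet isn't reproduced here; a sketch of what it plausibly contains, given the surrounding text, is (the base image tag and script name are assumptions):

```dockerfile
# Sketch: extends the official Postgres image from Docker Hub
FROM postgres:9.4

# Scripts placed in this directory run automatically during container startup
COPY db-init.sh /docker-entrypoint-initdb.d/
```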
Shell or SQL files included in the /docker-entrypoint-initdb.d directory are automatically run during container startup, which leads to the execution of db-init.sh:
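A sketch of what db-init.sh likely does, based on the description that follows (the DB_NAME, DB_USER and DB_PASSWD variable names are assumptions):

```shell
#!/bin/bash
# Sketch only: creates the role and database if they don't already exist,
# then grants permissions. Variable names are assumed.
psql -U postgres -tc "SELECT 1 FROM pg_roles WHERE rolname = '$DB_USER'" | grep -q 1 \
  || psql -U postgres -c "CREATE ROLE $DB_USER LOGIN PASSWORD '$DB_PASSWD'"
psql -U postgres -tc "SELECT 1 FROM pg_database WHERE datname = '$DB_NAME'" | grep -q 1 \
  || psql -U postgres -c "CREATE DATABASE $DB_NAME OWNER $DB_USER"
psql -U postgres -c "GRANT ALL PRIVILEGES ON DATABASE $DB_NAME TO $DB_USER"
```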
This script takes care of creating a Postgres role and a database, and grants database permissions to the newly created user, in case they don't exist. Database and role information is passed through environment variables as shown next:
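For example, the container could be started like this (db_dvdrental is the database name used throughout the post; the -e variable names, user and password are assumptions):

```shell
docker run -d -p 5432:5432 \
    -e DB_NAME=db_dvdrental -e DB_USER=user_dvdrental -e DB_PASSWD=changeit \
    asimio/postgres:latest
```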
Now that we have a Postgres image with the database and role used to connect to it already created, there are a couple of options on how to create and setup the database schema:
- The 1st option: no other Docker image is needed, since we already have a database and credentials; the integration test itself (during its life cycle) would take care of setting up the schema, say by using a combination of Spring's SqlScriptsTestExecutionListener and the @Sql annotation.
- The 2nd option: provide an image with the database schema already set up, meaning the database already includes tables, views, triggers and functions, as well as the seeded data.
No matter which option is chosen, it is my opinion that integration tests:
- Should not impose a test execution order.
- Should run against external resources that closely match those used in the production environment (DB servers, message brokers, search engines, …).
- Should start from a known state, meaning it is preferable for every test to run against the same seeded data; this follows from meeting bullet #1.
This is not a one-size-fits-all solution; it depends on particular needs. If, for instance, the database used by the application is an in-memory product, it might be faster to reuse the same container and drop and recreate the tables before each test starts, instead of starting a new container for every test.
I decided to go with the 2nd option, creating a Docker image with the schema and seeded data already set up. In this example each integration test is going to start a new container, where the schema doesn't need to be created and the data, which might be plentiful, doesn't need to be seeded during container startup. The next section covers this option.
3.2 DVD RENTAL DB POSTGRES DOCKER IMAGE
Let’s take a look at relevant commands of its Dockerfile:
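The Dockerfile isn't reproduced in full; a sketch consistent with the description that follows is (variable names, credential values and file names are assumptions):

```dockerfile
# Sketch: extends the base Postgres image built in the previous section
FROM asimio/postgres:latest

# DB name and credentials picked up by the base image's init script (values assumed)
ENV DB_NAME db_dvdrental
ENV DB_USER user_dvdrental
ENV DB_PASSWD changeit

# Database dump, plus the script that restores it at container startup
COPY dvdrental.tar /tmp/
COPY db-restore.sh /docker-entrypoint-initdb.d/
```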
It sets environment variables with the DB name and credentials, includes a dump of the database, and copies a script that restores the DB from the dump into the /docker-entrypoint-initdb.d directory which, as I mentioned earlier, has its Shell and SQL scripts executed during container startup.
The script to restore from a Postgres dump looks like:
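Roughly (the dump file location and name are assumptions; during startup the init scripts run with superuser access):

```shell
#!/bin/bash
# Sketch only: restores the dvdrental dump into the database created at startup
pg_restore -U postgres -d $DB_NAME /tmp/dvdrental.tar
```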
To build the images, first go to where asimio/postgres's Dockerfile is located and run:
docker build -t asimio/postgres:latest .
Then go to where asimio/db_dvdrental's Dockerfile is found and run:
docker build -t asimio/db_dvdrental:latest .
4. GENERATING JPA ENTITIES FROM DATABASE SCHEMA
I'll use a Maven plugin to generate the JPA entities. They don't include an @Id generation strategy, but they are a very good starting point that saves a lot of the time it would take to create the POJOs manually. The relevant section in pom.xml looks like:
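A sketch of such a configuration, assuming the hibernate3-maven-plugin is the plugin in question (plugin version, file paths and the JDBC driver version are assumptions):

```xml
<!-- Sketch: reverse-engineers JPA entities from the running database -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>hibernate3-maven-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <components>
      <component>
        <name>hbm2java</name>
        <implementation>jdbcconfiguration</implementation>
        <outputDirectory>target/generated-sources/hibernate3</outputDirectory>
      </component>
    </components>
    <componentProperties>
      <revengfile>src/main/resources/db_dvdrental.reveng.xml</revengfile>
      <propertyfile>src/main/resources/db_dvdrental.hibernate.properties</propertyfile>
      <jdk5>true</jdk5>
      <ejb3>true</ejb3>
    </componentProperties>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.postgresql</groupId>
      <artifactId>postgresql</artifactId>
      <version>9.4-1206-jdbc42</version>
    </dependency>
  </dependencies>
</plugin>
```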
The plugin configuration references db_dvdrental.reveng.xml, which includes the schema we would like to use for the reverse engineering task of generating the POJOs:
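A sketch of db_dvdrental.reveng.xml, assuming the tables live in the default public schema:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-reverse-engineering PUBLIC
  "-//Hibernate/Hibernate Reverse Engineering DTD 3.0//EN"
  "http://hibernate.sourceforge.net/hibernate-reverse-engineering-3.0.dtd">
<hibernate-reverse-engineering>
  <!-- Only reverse engineer the public schema (assumption) -->
  <schema-selection match-schema="public" />
</hibernate-reverse-engineering>
```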
And it also references db_dvdrental.hibernate.properties which includes the JDBC connection properties to connect to the database to read the schema from:
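A sketch of db_dvdrental.hibernate.properties (the username, password and use of a docker.host placeholder are assumptions consistent with the rest of the post):

```properties
# Sketch: JDBC settings pointing at the running db_dvdrental container
hibernate.connection.driver_class=org.postgresql.Driver
hibernate.connection.url=jdbc:postgresql://${docker.host}:5432/db_dvdrental
hibernate.connection.username=user_dvdrental
hibernate.connection.password=changeit
hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
```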
At this point all we need is to start a Docker container with the db_dvdrental DB set up and run a Maven command to generate the POJOs. To start the container, just run this command:
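Something along these lines (the host port mapping and container name are assumptions):

```shell
docker run -d -p 5432:5432 --name db_dvdrental asimio/db_dvdrental:latest
```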
In case DOCKER_HOST is set in your environment, run the following Maven command as is; otherwise hardcode docker.host to the IP where the Docker host can be found:
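Assuming the hibernate3-maven-plugin sketch above, the command would look roughly like (the IP is an example value):

```shell
# Replace the IP with your Docker host's address if DOCKER_HOST is not set
mvn hibernate3:hbm2java -Ddocker.host=192.168.99.100
```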
JPA entities should have been generated in target/generated-sources/hibernate3. The resulting package then needs to be copied to src/main/java.
5. TESTEXECUTIONLISTENER SUPPORTING CODE
A Spring TestExecutionListener implementation hooks into the integration test life cycle to provision Docker containers before the test gets executed. This is a snippet of such an implementation:
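The full class isn't reproduced here; a sketch consistent with the method descriptions below might look like this (the @DockerConfig attribute names and the ContainerStartMode.FOR_EACH_TEST constant are assumptions; DefaultDockerClient.fromEnv() and pull() are real Spotify docker-client calls):

```java
import com.spotify.docker.client.DefaultDockerClient;
import com.spotify.docker.client.DockerClient;
import org.springframework.test.context.TestContext;
import org.springframework.test.context.support.AbstractTestExecutionListener;

// Sketch only: method bodies reconstruct the behavior described in the post
public class DockerizedTestExecutionListener extends AbstractTestExecutionListener {

    private DockerClient docker;

    @Override
    public void beforeTestClass(TestContext testContext) throws Exception {
        DockerConfig config = testContext.getTestClass().getAnnotation(DockerConfig.class);
        docker = DefaultDockerClient.fromEnv().build();  // reads DOCKER_* env variables
        docker.pull(config.image());
        if (ContainerStartMode.ONCE == config.startMode()) {
            startContainer(config);  // one container shared by every test method
        }
    }

    @Override
    public void prepareTestInstance(TestContext testContext) throws Exception {
        DockerConfig config = testContext.getTestClass().getAnnotation(DockerConfig.class);
        if (ContainerStartMode.FOR_EACH_TEST == config.startMode()) {
            startContainer(config);  // fresh container per test method
        }
    }

    @Override
    public void afterTestClass(TestContext testContext) throws Exception {
        // stop and remove any containers started for this test class
    }
    // ...
}
```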
This class's methods beforeTestClass(), prepareTestInstance() and afterTestClass() manage the Docker containers' life cycle:
- beforeTestClass(): gets executed only once, before the first test. Its purpose is to pull the Docker image and depending on whether the test class will re-use the same running container or not, it might also start a container.
- prepareTestInstance(): gets called before the next test method is run. Its purpose is to start a new container depending on if each test method requires it, otherwise the same running container started in beforeTestClass() will be re-used.
- afterTestClass(): gets executed only once, after all tests have been executed. Its purpose is to stop and remove running containers.
Why am I not implementing this functionality in TestExecutionListener's beforeTestMethod() and afterTestMethod() methods, which judging by their names seem more suitable? The problem with these methods lies in how I'm passing information back to Spring for loading the application context.
To prevent hard-coding any port mapped from the Docker container to the Docker host, I decided to use random ports. For instance, in the demo I'm using a Postgres container that internally listens on port 5432, but it needs to be mapped to a host port for other applications to connect to the database. This host port is chosen randomly and put in the JVM System properties, as shown in line 90 of the listener. It might end up with a System property like:
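HOST_PORT_FOR_5432 set to whatever host port was picked, e.g. 32769. A minimal, plain-JDK sketch of how such a random free port could be obtained and published as a System property (not the listener's actual code):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class RandomHostPort {

    // Binding to port 0 makes the OS pick a random free port
    static int randomFreePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        int hostPort = randomFreePort();
        // Published so the application context can later resolve ${HOST_PORT_FOR_5432}
        System.setProperty("HOST_PORT_FOR_5432", String.valueOf(hostPort));
        System.out.println("HOST_PORT_FOR_5432=" + System.getProperty("HOST_PORT_FOR_5432"));
    }
}
```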
and that leads us to the next section.
6. ACTORDAOIT INTEGRATION TEST CLASS
ActorDaoIT is not that complex, let’s take a look at it:
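The class isn't reproduced in full; a sketch consistent with the annotations discussed below might look like this (the @DockerConfig attribute names and values, the registry, and the Spring Boot 1.x annotations are assumptions):

```java
// Sketch only: annotation attributes reconstruct the configuration described below
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@TestExecutionListeners({ DockerizedTestExecutionListener.class,
        DependencyInjectionTestExecutionListener.class,
        DirtiesContextTestExecutionListener.class })
@DockerConfig(image = "asimio/db_dvdrental", tag = "latest",
        containerPorts = { 5432 },
        startMode = ContainerStartMode.FOR_EACH_TEST)
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
public class ActorDaoIT {

    @Autowired
    private ActorDao actorDao;

    @Test
    public void findAllShouldReturn200Actors() {
        assertEquals(200, actorDao.findAll().size());
    }
}
```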
The interesting parts are @TestExecutionListeners, @DockerConfig and @DirtiesContext annotations used for configuration.
- DockerizedTestExecutionListener, discussed earlier, is configured through @DockerConfig with information about the Docker image (its name and tag), where it will be pulled from, and the container ports that will be exposed.
- DependencyInjectionTestExecutionListener is used so that actorDao is injected and available for the test to run.
- DirtiesContextTestExecutionListener is used with the @DirtiesContext annotation to cause Spring to reload the application context after each test in the class is executed.
The reason for reloading the application context, as done in this demo, is that the JDBC URL changes depending on the Docker host mapped container port discussed at the end of the previous section. Let's look at the properties file used to build the data source bean:
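A sketch of the relevant section (the docker.host and HOST_PORT_FOR_5432 placeholders are from the post; property names follow Spring Boot conventions, and the credentials are assumptions):

```yaml
spring:
  datasource:
    driverClassName: org.postgresql.Driver
    url: jdbc:postgresql://${docker.host}:${HOST_PORT_FOR_5432}/db_dvdrental
    username: user_dvdrental
    password: changeit
```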
Noticed the HOST_PORT_FOR_5432 placeholder? After DockerizedTestExecutionListener starts the Postgres DB Docker container, it adds a System property named HOST_PORT_FOR_5432 with a random value. When it is time for the Spring JUnit runner to load the application context, it successfully replaces the placeholder found in the YAML file with the available property value. This only works because the Docker life cycle is managed in DockerizedTestExecutionListener's beforeTestClass() and prepareTestInstance(), where the application context hasn't been loaded yet; that would not be the case with beforeTestMethod().
If I were to use the same running container for every test, there wouldn't be any need to reload the application context: the DirtiesContext-related listener and annotation could be removed, and @DockerConfig's startMode could be set to ContainerStartMode.ONCE so that the Docker container is started only once, through DockerizedTestExecutionListener's beforeTestClass().
Now that the actorDao bean is created and available, each individual integration test executes as usual.
There is still another placeholder in the YAML file, docker.host; it will be addressed in the next sections.
7. MAVEN PLUGIN CONFIGURATION
Following the naming convention, the integration test was named using IT as a suffix, for instance ActorDaoIT. This means maven-surefire-plugin won't execute it during the test phase, so maven-failsafe-plugin was used instead. The relevant section from pom.xml includes:
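A sketch of the usual failsafe setup (the plugin version is an assumption; failsafe binds IT-suffixed classes to the integration-test and verify goals by default):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.19.1</version>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```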
8. RUNNING FROM COMMAND LINE
In addition to the JAVA_HOME, M2_HOME and PATH environment variables, there are a few more that need to be set (in ~/.bashrc, for instance), since they are used by Spotify's docker-client in DockerizedTestExecutionListener.
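Spotify's docker-client's fromEnv() reads the standard Docker environment variables; the values below are examples, not the demo's actual settings:

```shell
export DOCKER_HOST=tcp://192.168.99.100:2376
export DOCKER_CERT_PATH=~/.docker/machine/machines/default
```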
Once these variables have been sourced, the demo can be built and tested using:
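Roughly (the IP is an example value matching whatever DOCKER_HOST points at):

```shell
mvn clean verify -Ddocker.host=192.168.99.100
```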
docker.host is passed as a VM argument (the DOCKER_HOST IP only) and is replaced when the Spring JUnit runner creates the application context for each test.
9. RUNNING FROM ECLIPSE
As in the previous section, the DOCKER_* environment variables and the docker.host VM argument need to be passed to the test class in Eclipse. The way to accomplish this is to set them in the Run Configurations dialog -> Environment tab:
and Run Configurations dialog -> Arguments tab
and run the JUnit test as usual.
That's all, enjoy! I'd appreciate feedback and will be glad to address it. If you found this post helpful and would like to receive updates when content like this gets published, sign up for the newsletter.
10. SOURCE CODE
Accompanying source code for this blog post can be found at:
- Postgres base and db_dvdrental DB Docker images repos
- Integration Testing Spring Boot Postgres Docker source code