As part of your organization’s modernization effort, your team is writing Spring Boot applications that store data to, and retrieve data from Azure Cosmos DBs instead of relational databases.
Your integration tests might be connecting to a dedicated Cosmos DB hosted in Azure, which increases your organization’s subscription cost.
You could instead use a dedicated Cosmos DB emulator Docker container, but then you would need to make sure it is running, and know which ports it listens on, before you seed test data and run each test.
Or you might not have integration tests at all. But you also know integration testing is one of the keys to deploying often and with confidence.
This blog post shows you how to write integration tests with Testcontainers and the Cosmos DB Docker emulator for Spring Boot applications.
It also covers listening on available ports instead of hard-coding them, and importing the Cosmos DB emulator's self-signed certificate into a temporary Java truststore your tests can use.
All of it automated.
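As a preview, here is a minimal sketch of what that setup could look like with the Testcontainers `azure` module. The image tag and the `spring.cloud.azure.cosmos.*` property keys are assumptions that may vary with your Testcontainers and Spring Cloud Azure versions:

```java
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.KeyStore;

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.CosmosDBEmulatorContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@SpringBootTest
@Testcontainers
class FilmRepositoryIT {

    // Testcontainers maps the emulator to a random available host port; no hard-coded ports
    @Container
    static CosmosDBEmulatorContainer emulator = new CosmosDBEmulatorContainer(
            DockerImageName.parse("mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator:latest"));

    @DynamicPropertySource
    static void cosmosProperties(DynamicPropertyRegistry registry) throws Exception {
        // Import the emulator's self-signed certificate into a temporary truststore
        Path trustStorePath = Files.createTempFile("azure-cosmos-emulator", ".keystore");
        KeyStore keyStore = emulator.buildNewKeyStore();
        try (OutputStream out = Files.newOutputStream(trustStorePath)) {
            keyStore.store(out, emulator.getEmulatorKey().toCharArray());
        }
        System.setProperty("javax.net.ssl.trustStore", trustStorePath.toString());
        System.setProperty("javax.net.ssl.trustStorePassword", emulator.getEmulatorKey());
        System.setProperty("javax.net.ssl.trustStoreType", "PKCS12");

        // Property keys assumed; adjust to your Spring Cloud Azure version
        registry.add("spring.cloud.azure.cosmos.endpoint", emulator::getEmulatorEndpoint);
        registry.add("spring.cloud.azure.cosmos.key", emulator::getEmulatorKey);
    }
}
```

The rest of the post walks through each of these pieces in detail.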
Azure Spring Apps is a platform as a service (PaaS) and one of the services Azure offers for your organization to run Spring Boot applications.
With very little effort, you can provision the Azure Spring Apps infrastructure required to deploy and run Spring Boot Web, RESTful, Batch, etc. applications. This allows your team to spend more time on the applications’ business logic.
This blog post covers provisioning the Azure Spring Apps infrastructure using Azure CLI to deploy a Spring Boot application previously stored in a Maven repository hosted in an Azure Blob Storage container.
Your organization implemented and deployed Spring Boot applications to send emails from Thymeleaf templates.
Let’s say they include reports with confidential content, intellectual property, or sensitive data. Is your organization testing these emails?
How would you verify these emails are being sent to the expected recipients?
How would you assert these emails include the expected data, company logo, and/or file attachments?
This blog post shows you how to write integration tests with GreenMail and Jsoup for Spring Boot applications that send emails.
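To give you a taste, such a test could look like this sketch, assuming `greenmail-junit5` and `jsoup` are on the test classpath; the recipient address, subject, and selector below are made-up examples:

```java
import jakarta.mail.internet.MimeMessage;

import com.icegreen.greenmail.junit5.GreenMailExtension;
import com.icegreen.greenmail.util.GreenMailUtil;
import com.icegreen.greenmail.util.ServerSetupTest;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;

import static org.junit.jupiter.api.Assertions.assertEquals;

class ReportEmailIT {

    // GreenMail starts an in-memory SMTP server on a test port (3025 by default)
    @RegisterExtension
    static GreenMailExtension greenMail = new GreenMailExtension(ServerSetupTest.SMTP);

    @Test
    void sendsDailyReportToExpectedRecipient() throws Exception {
        // ... invoke the application code that sends the report email here ...

        MimeMessage[] received = greenMail.getReceivedMessages();
        assertEquals(1, received.length);
        assertEquals("reports@example.com", received[0].getAllRecipients()[0].toString());

        // Parse the HTML body with Jsoup and assert on its content
        Document body = Jsoup.parse(GreenMailUtil.getBody(received[0]));
        assertEquals("Daily Shipment Report", body.selectFirst("h1").text());
    }
}
```

No real SMTP server, no real inboxes, and the assertions run against the actual rendered HTML.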
Your team used Java and Spring Boot to implement logistics functionality, a hotel reservation system, an e-commerce shopping cart, you name it.
Now business folks would like to receive emails with daily shipment reports, weekly bookings, or abandoned shopping carts: consolidated data they can use to make decisions.
Spring Framework, and specifically Spring Boot, includes seamless integration with template engines to send HTML and/or text emails.
This blog post covers how to send HTML and text emails using Spring Boot and Thymeleaf template engine.
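The core of it could look like this sketch of a mail-sending service. The service name, template name, and subject are made-up examples, and the `spring6` Thymeleaf package assumes Spring Boot 3:

```java
import java.util.List;

import jakarta.mail.MessagingException;
import jakarta.mail.internet.MimeMessage;

import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.mail.javamail.MimeMessageHelper;
import org.springframework.stereotype.Service;
import org.thymeleaf.context.Context;
import org.thymeleaf.spring6.SpringTemplateEngine;

@Service
public class ReportEmailService {

    private final JavaMailSender mailSender;
    private final SpringTemplateEngine templateEngine; // auto-configured by spring-boot-starter-thymeleaf

    public ReportEmailService(JavaMailSender mailSender, SpringTemplateEngine templateEngine) {
        this.mailSender = mailSender;
        this.templateEngine = templateEngine;
    }

    public void sendDailyReport(String to, List<String> shipmentLines) throws MessagingException {
        // Render templates/daily-report.html with the model variables below
        Context context = new Context();
        context.setVariable("shipments", shipmentLines);
        String html = templateEngine.process("daily-report", context);

        MimeMessage message = mailSender.createMimeMessage();
        MimeMessageHelper helper = new MimeMessageHelper(message, true, "UTF-8"); // true = multipart
        helper.setTo(to);
        helper.setSubject("Daily shipment report");
        helper.setText(html, true); // true = HTML body
        mailSender.send(message);
    }
}
```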
A few days ago I was working on the accompanying source code for blog posts, and ran into the same issue a few times.
The previously running azure-cosmos-emulator Docker container wouldn’t start.
I had to run a new azure-cosmos-emulator container, but every time I ran a new container, the emulator’s PEM-encoded certificate changed. That meant the Spring Boot application failed to start because it couldn’t connect to the Cosmos DB anymore. The SSL handshake failed.
A manual solution involved running a bash script that deletes the invalid certificate from the TrustStore, and adds a new one generated during the new Docker container start-up process.
That would be fine if you only need to use this approach once or twice during the development phase.
But this manual approach won’t work for running Integration Tests as part of your CI/CD pipeline, regardless of whether an azure-cosmos-emulator container is already running or you rely on Testcontainers to run a new one.
This blog post covers how to programmatically extract an SSL certificate from a secure connection and add it to a TrustStore that you can use in your integration tests, for instance.
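The heart of the approach can be sketched with JDK-only APIs: complete a handshake with a permissive trust manager, read the peer certificate chain, and add it to an in-memory `KeyStore`. The class and method names here are my own:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.X509Certificate;

public class TrustStoreBuilder {

    /** Builds an in-memory truststore containing the certificate chain served at host:port. */
    public static KeyStore trustStoreFor(String host, int port) throws Exception {
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        trustStore.load(null, null); // start from an empty, in-memory truststore

        // Permissive trust manager, used only to complete the handshake and read the peer chain
        TrustManager[] acceptAll = { new X509TrustManager() {
            public void checkClientTrusted(X509Certificate[] chain, String authType) { }
            public void checkServerTrusted(X509Certificate[] chain, String authType) { }
            public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
        } };
        SSLContext context = SSLContext.getInstance("TLS");
        context.init(null, acceptAll, null);

        try (SSLSocket socket = (SSLSocket) context.getSocketFactory().createSocket(host, port)) {
            socket.startHandshake();
            Certificate[] chain = socket.getSession().getPeerCertificates();
            for (int i = 0; i < chain.length; i++) {
                trustStore.setCertificateEntry(host + "-" + i, chain[i]); // one alias per certificate
            }
        }
        return trustStore;
    }
}
```

The resulting `KeyStore` can then be stored to a temporary file and handed to your tests via the `javax.net.ssl.trustStore` system properties.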
Let’s say you are deploying your Spring Boot RESTful applications to Azure.
Some of these Spring Boot applications might have been modernization rewrites to use Cosmos DB instead of a relational database.
You might have even added support to write dynamic Cosmos DB queries using Spring Data Cosmos. And let’s also assume you wrote unit tests for REST controllers, business logic, and utility classes.
Now you need to write integration tests to verify that the interactions between different parts of the system work well, including retrieving from, and storing to, a NoSQL database like Cosmos DB.
This blog post shows you how to write a custom Spring TestExecutionListener to seed data in a Cosmos database container. Each Spring Boot integration test will then start from a known Cosmos container state, so you won’t need to force the tests to run in a specific order.
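The listener itself can be sketched in a few lines; `FilmRepository` and `Film` below are made-up stand-ins for your Spring Data Cosmos repository and entity:

```java
import java.util.List;

import org.springframework.test.context.TestContext;
import org.springframework.test.context.support.AbstractTestExecutionListener;

public class CosmosSeedingListener extends AbstractTestExecutionListener {

    @Override
    public void beforeTestMethod(TestContext testContext) {
        // Look up the repository from the test's application context
        FilmRepository repository = testContext.getApplicationContext()
                .getBean(FilmRepository.class);
        repository.deleteAll(); // reset the Cosmos container to a known state
        repository.saveAll(List.of(
                new Film("1", "The Matrix"),
                new Film("2", "Inception")));
    }
}
```

Tests then register it with `@TestExecutionListeners(listeners = CosmosSeedingListener.class, mergeMode = TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS)` so the default listeners keep running too.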
Let’s say you need to write a RESTful endpoint that takes a number of request parameters and uses them to filter data from a database.
Some of the request parameters are optional, so you would only include query conditions in each SQL statement depending on the request parameters sent with each request.
You’ll be writing dynamic SQL queries.
I have covered different solutions when the data comes from a relational database:
- Writing dynamic SQL queries using Spring Data JPA repositories and EntityManager
- Writing dynamic SQL queries using Spring Data JPA Specification and Criteria API
- Writing dynamic SQL queries using Spring Data JPA repositories and Querydsl
But what if the data store is not a relational database? What if the data store is a NoSQL database?
More specifically, what if the database is Azure Cosmos DB?
This tutorial teaches you how to extend Spring Data Cosmos for your repositories to access the ReactiveCosmosTemplate so that you can write dynamic Cosmos DB queries.
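The shape of such a dynamic query could look like this sketch of a custom repository fragment; `Film`, the fragment interface, and the field names are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

import com.azure.cosmos.models.SqlParameter;
import com.azure.cosmos.models.SqlQuerySpec;
import com.azure.spring.data.cosmos.core.ReactiveCosmosTemplate;
import reactor.core.publisher.Flux;

public class FilmRepositoryCustomImpl implements FilmRepositoryCustom {

    private final ReactiveCosmosTemplate template;

    public FilmRepositoryCustomImpl(ReactiveCosmosTemplate template) {
        this.template = template;
    }

    public Flux<Film> findByOptionalFilters(String genre, Integer year) {
        // Append query conditions only for the request parameters that were sent
        StringBuilder sql = new StringBuilder("SELECT * FROM c WHERE 1=1");
        List<SqlParameter> parameters = new ArrayList<>();
        if (genre != null) {
            sql.append(" AND c.genre = @genre");
            parameters.add(new SqlParameter("@genre", genre));
        }
        if (year != null) {
            sql.append(" AND c.year = @year");
            parameters.add(new SqlParameter("@year", year));
        }
        return template.runQuery(new SqlQuerySpec(sql.toString(), parameters), Film.class, Film.class);
    }
}
```

Parameterized `SqlQuerySpec` queries keep the dynamic parts out of the SQL string itself, the same way prepared statements do for relational databases.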
Let’s say you had to write a Business Service implementation using Spring’s TransactionTemplate instead of the @Transactional annotation.
As a quick example, you need to retrieve film reviews from external partners using RESTful APIs.
You also need to update rows in the film-related relational database tables.
And lastly, you need to publish a JMS message for interested parties to process these changes.
A made-up example, but the point is that you need to write business logic that involves sending RESTful API requests to external services, executing multiple SQL statements, and publishing a message to a JMS queue.
You have already realized that the RESTful API requests and publishing a JMS message shouldn’t be part of the JDBC transaction, so you end up using TransactionTemplate, and a code snippet similar to:
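A sketch of that shape, where `PartnerClient`, `Review`, and the repositories are made-up collaborators:

```java
import java.util.List;

import org.springframework.jms.core.JmsTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.support.TransactionTemplate;

@Service
public class FilmReviewService {

    private final PartnerClient partnerClient;
    private final FilmRepository filmRepository;
    private final ReviewRepository reviewRepository;
    private final TransactionTemplate transactionTemplate;
    private final JmsTemplate jmsTemplate;

    // constructor injecting the five collaborators omitted for brevity

    public void processFilmReviews(String filmId) {
        // 1. RESTful API call, deliberately outside the JDBC transaction
        List<Review> reviews = partnerClient.fetchReviews(filmId);

        // 2. Only the SQL statements run inside the transaction
        transactionTemplate.executeWithoutResult(status -> {
            filmRepository.updateRating(filmId, averageRating(reviews));
            reviewRepository.saveAll(reviews);
        });

        // 3. Publish the JMS message after the transaction has committed
        jmsTemplate.convertAndSend("film-reviews-updated", filmId);
    }

    private double averageRating(List<Review> reviews) {
        return reviews.stream().mapToDouble(Review::rating).average().orElse(0.0);
    }
}
```

Keeping the slow remote call and the message publication outside `executeWithoutResult` means the JDBC connection is held only for the SQL statements, and no message is sent if the transaction rolls back.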
You might also want to store these Docker images in a private Docker registry.
Microsoft’s Azure Container Registry (ACR) is a cheap option to store both public and private Docker images.
It makes even more sense to use ACR if your organization is already invested in other Azure services.
Spring Boot application in a Docker image stored in a private Azure Container Registry
This tutorial covers setting up the Azure infrastructure and Maven configuration to push a Docker image to a private ACR repository.
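As one possible shape of the Maven side, the `jib-maven-plugin` can build and push an image without a local Docker daemon; the registry name below is a placeholder (ACR login servers end in `.azurecr.io`), and the tutorial’s actual plugin choice may differ:

```xml
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>jib-maven-plugin</artifactId>
  <version>3.4.0</version>
  <configuration>
    <to>
      <!-- placeholder registry and repository names -->
      <image>myregistry.azurecr.io/spring-boot/my-app:${project.version}</image>
    </to>
  </configuration>
</plugin>
```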
Logging is an important part of application development.
It helps you troubleshoot issues and follow execution flows, not only inside a single application but also when requests span multiple services.
Logging also helps you to capture data and replicate production bugs in a development environment.
Oftentimes, searching logs efficiently is a daunting task. That’s why there are plenty of Log Aggregators, such as Splunk, ELK, Datadog, AWS CloudWatch, and many more, that help with capturing, standardizing, and consolidating logs to assist with log indexing, analysis, and searching.
Standard-formatted log messages like:
2023-08-01 12:43:44.421 INFO 73710 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ''
are easy for Engineers to read but not so easy for Log Aggregators to parse, especially if the log format keeps changing.
This blog post helps you configure Spring Boot applications to format log messages as JSON using SLF4J, Logback, and Logstash, making them ready to be fed to Log Aggregators.
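As a preview, a minimal `logback-spring.xml` using the logstash-logback-encoder could look like this sketch:

```xml
<configuration>
  <!-- LogstashEncoder writes each log event as a single JSON line -->
  <appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="JSON_CONSOLE"/>
  </root>
</configuration>
```

With this in place, the Tomcat startup line shown above becomes a JSON object with `timestamp`, `level`, `logger_name`, and `message` fields that any Log Aggregator can index without custom parsing rules.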