Search results
Seeding Cosmos DB Data to run Spring Boot 2 Integration Tests
1. OVERVIEW
Let’s say you are deploying your Spring Boot RESTful applications to Azure.
Some of these Spring Boot applications might have been modernization rewrites to use Cosmos DB instead of relational databases.
You might have even added support to write dynamic Cosmos DB queries using Spring Data Cosmos. And let’s also assume you wrote unit tests for REST controllers, business logic, and utility classes.
Now you need to write integration tests to verify that the interactions between different parts of the system work well, including retrieving data from, and storing data to, a NoSQL database like Cosmos DB.
This blog post shows you how to write a custom Spring TestExecutionListener to seed data in a Cosmos DB container. Each Spring Boot integration test will then start from a known Cosmos container state, so you won’t need to force the tests to run in a specific order.
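As a taste of the approach, here is a minimal sketch of such a listener; the FilmRepository and Film types are hypothetical stand-ins for your own Spring Data Cosmos repository and document class, and a blocking CosmosRepository-style repository is assumed:

import org.springframework.test.context.TestContext;
import org.springframework.test.context.support.AbstractTestExecutionListener;

import java.util.List;

public class CosmosSeedTestExecutionListener extends AbstractTestExecutionListener {

    @Override
    public void beforeTestMethod(TestContext testContext) {
        // Look up the (hypothetical) repository bean from the test's application context
        FilmRepository repository = testContext.getApplicationContext()
                .getBean(FilmRepository.class);

        // Reset the container to a known state before every test method
        repository.deleteAll();
        repository.saveAll(List.of(
                new Film("1", "ACADEMY DINOSAUR"),
                new Film("2", "ACE GOLDFINGER")));
    }
}

You would then register it on the integration test class with @TestExecutionListeners(listeners = CosmosSeedTestExecutionListener.class, mergeMode = TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS) so the default Spring listeners keep running as well.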
Writing dynamic Cosmos DB queries using Spring Data Cosmos repositories and ReactiveCosmosTemplate
1. OVERVIEW
Let’s say you need to write a RESTful endpoint that takes a number of request parameters and uses them to filter data from a database.
Something like:
/api/users?firstName=...&lastName=...
Some of the request parameters are optional, so you would include query conditions in each SQL statement only for the request parameters sent with each request.
You’ll be writing dynamic SQL queries.
I have covered different solutions when the data comes from a relational database:
- Writing dynamic SQL queries using Spring Data JPA repositories and EntityManager
- Writing dynamic SQL queries using Spring Data JPA Specification and Criteria API
- Writing dynamic SQL queries using Spring Data JPA repositories and Querydsl
But what if the data store is not a relational database? What if the data store is a NoSQL database?
More specifically, what if the database is Azure Cosmos DB?
This tutorial teaches you how to extend Spring Data Cosmos for your repositories to access the ReactiveCosmosTemplate so that you can write dynamic Cosmos DB queries.
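As a preview of the idea, here is a minimal sketch of a custom repository fragment; the User type, the UserRepositoryCustom interface, and the query fields are hypothetical, and it assumes ReactiveCosmosTemplate#runQuery is available in the azure-spring-data-cosmos version you use:

import com.azure.cosmos.models.SqlParameter;
import com.azure.cosmos.models.SqlQuerySpec;
import com.azure.spring.data.cosmos.core.ReactiveCosmosTemplate;
import reactor.core.publisher.Flux;

import java.util.ArrayList;
import java.util.List;

public class UserRepositoryCustomImpl implements UserRepositoryCustom {

    private final ReactiveCosmosTemplate reactiveCosmosTemplate;

    public UserRepositoryCustomImpl(ReactiveCosmosTemplate reactiveCosmosTemplate) {
        this.reactiveCosmosTemplate = reactiveCosmosTemplate;
    }

    @Override
    public Flux<User> findUsers(String firstName, String lastName) {
        StringBuilder query = new StringBuilder("SELECT * FROM c WHERE 1=1");
        List<SqlParameter> parameters = new ArrayList<>();

        // Only add conditions for the request parameters that were actually sent
        if (firstName != null) {
            query.append(" AND c.firstName = @firstName");
            parameters.add(new SqlParameter("@firstName", firstName));
        }
        if (lastName != null) {
            query.append(" AND c.lastName = @lastName");
            parameters.add(new SqlParameter("@lastName", lastName));
        }

        // Assumed API: run the parameterized Cosmos SQL query through the template
        return reactiveCosmosTemplate.runQuery(
                new SqlQuerySpec(query.toString(), parameters), User.class, User.class);
    }
}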
Unit testing Spring's TransactionTemplate, TransactionCallback with JUnit and Mockito
1. OVERVIEW
Let’s say you had to write a Business Service implementation using Spring’s TransactionTemplate instead of the @Transactional annotation.
As a quick example, you need to retrieve film reviews from external partners using RESTful APIs.
You also need to update rows in film-related relational database tables.
And lastly, you need to publish a JMS message so that interested parties can process these changes.
It’s a made-up example, but the point is that you need to write business logic that involves sending RESTful API requests to external services, executing multiple SQL statements, and publishing a message to a JMS queue.
You have already realized that the RESTful API requests and publishing a JMS message shouldn’t be part of the JDBC transaction, so you end up using TransactionTemplate and a code snippet along the lines of the sketch below.
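A minimal sketch, with hypothetical collaborators (ReviewClient, FilmDao, ReviewPublisher, and a Review record), of keeping the REST call and the JMS publish outside the transaction while only the SQL statements run inside TransactionTemplate:

import org.springframework.transaction.support.TransactionTemplate;

public class FilmReviewService {

    private final TransactionTemplate transactionTemplate;
    private final ReviewClient reviewClient;        // hypothetical RESTful API client
    private final FilmDao filmDao;                  // hypothetical JDBC/JPA data access object
    private final ReviewPublisher reviewPublisher;  // hypothetical JMS publisher

    public FilmReviewService(TransactionTemplate transactionTemplate, ReviewClient reviewClient,
                             FilmDao filmDao, ReviewPublisher reviewPublisher) {
        this.transactionTemplate = transactionTemplate;
        this.reviewClient = reviewClient;
        this.filmDao = filmDao;
        this.reviewPublisher = reviewPublisher;
    }

    public void refreshReviews(long filmId) {
        // 1. The RESTful API call happens outside the JDBC transaction
        Review review = reviewClient.fetchLatestReview(filmId);

        // 2. Only the SQL statements run inside the transaction
        Review saved = transactionTemplate.execute(status -> {
            filmDao.updateFilmRating(filmId, review.rating());
            return filmDao.saveReview(filmId, review);
        });

        // 3. The JMS message is published after the transaction has committed
        reviewPublisher.publishReviewUpdated(saved);
    }
}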
Pushing Spring Boot 2 Docker images to Microsoft ACR
1. OVERVIEW
Google’s jib-maven-plugin, Spotify’s docker-maven-plugin, and, since Spring Boot 2.3, spring-boot-maven-plugin help you build Docker images for your Spring Boot applications.
You might also want to store these Docker images in a private Docker registry.
Microsoft’s Azure Container Registry (ACR) is a cheap option to store both, public and private Docker images.
It would make even more sense to use ACR if your organization has already invested in other Azure services.
Spring Boot application in a Docker image stored in a private Azure Container Registry
This tutorial covers setting up the Azure infrastructure and Maven configuration to push a Docker image to a private ACR repository.
Configuring JSON-Formatted Logs in Spring Boot applications with Slf4j, Logback and Logstash
1. OVERVIEW
Logging is an important part of application development.
It helps you troubleshoot issues and follow execution flows, not only inside a single application but also when a request spans multiple services.
Logging also helps you to capture data and replicate production bugs in a development environment.
Oftentimes, searching logs efficiently is a daunting task. That’s why there are plenty of Log Aggregators, such as Splunk, ELK, Datadog, AWS CloudWatch, and many more, that help with capturing, standardizing, and consolidating logs to assist with log indexing, analysis, and searching.
Standard-formatted log messages like:
2023-08-01 12:43:44.421 INFO 73710 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ''
are easy for engineers to read but not so easy for Log Aggregators to parse, especially if the log format keeps changing.
This blog post helps you configure Spring Boot applications to format log messages as JSON using Slf4j, Logback, and Logstash, so that they are ready to be fed to Log Aggregators.
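As a small taste of the end result, here is a minimal sketch; it assumes the logstash-logback-encoder dependency is on the classpath and Logback is configured with its JSON encoder (which is what the post sets up), and the FilmRentalService class and field names are hypothetical:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import static net.logstash.logback.argument.StructuredArguments.kv;

public class FilmRentalService {

    private static final Logger log = LoggerFactory.getLogger(FilmRentalService.class);

    public void rentFilm(String filmId, String customerId) {
        // With a JSON encoder configured, the key/value pairs become first-class JSON fields,
        // e.g. {"message":"film rented ...","filmId":"...","customerId":"..."}
        log.info("film rented {} {}", kv("filmId", filmId), kv("customerId", customerId));
    }
}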
Uploading JaCoCo Code Coverage Reports to SonarQube
1. OVERVIEW
SonarQube is a widely adopted tool that collects, analyses, aggregates and reports the source code quality of your applications.
It helps teams measure the quality of their source code as applications mature.
SonarQube integrates with popular CI/CD tools so that you can get source code quality reports every time a new application build is triggered, helping teams fix errors, reduce technical debt, maintain a clean code base, and more.
A previous blog post covered how to generate code coverage reports using Maven and JaCoCo.
This blog post covers how to generate JaCoCo code coverage reports and upload them to SonarQube.
Writing dynamic SQL queries using Spring Data JPA repositories, Hibernate and Querydsl
1. OVERVIEW
Let’s say you need to implement a RESTful endpoint where some or all of the request parameters are optional.
An example of such endpoint looks like:
/api/films?minRentalRate=0.5&maxRentalRate=4.99&releaseYear=2006&category=Horror&category=Action
Let’s also assume you need to retrieve the data from a relational database.
Processing these requests translates to dynamic SQL queries, which helps you avoid writing a specific repository method for each combination of parameters; that approach would be error-prone and doesn’t scale as the number of request parameters increases.
In addition to:
- Writing dynamic SQL queries using Spring Data JPA Specification and Criteria
- Writing dynamic SQL queries using Spring Data JPA repositories and EntityManager
you could also write dynamic queries using Spring Data JPA and Querydsl.
Querydsl is a framework that helps you write type-safe queries on top of JPA and other backend technologies, using a fluent API.
Spring Data JPA provides support for your repositories to use Querydsl via the QuerydslJpaPredicateExecutor fragment.
These are some of the methods this repository fragment provides:
- findOne(Predicate predicate)
- findAll(Predicate predicate)
- findAll(Predicate predicate, Pageable pageable)
and more.
You can combine multiple Querydsl Predicates, which generates dynamic WHERE clause conditions.
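As a quick illustration of that Predicate combination, here is a minimal sketch; the Film entity, the Querydsl-generated QFilm class, and the parameter names are hypothetical stand-ins for the post’s domain model:

import com.querydsl.core.BooleanBuilder;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;

import java.math.BigDecimal;

interface FilmRepository extends JpaRepository<Film, Long>, QuerydslPredicateExecutor<Film> {
}

class FilmQueryService {

    private final FilmRepository filmRepository;

    FilmQueryService(FilmRepository filmRepository) {
        this.filmRepository = filmRepository;
    }

    Page<Film> findFilms(BigDecimal minRentalRate, BigDecimal maxRentalRate,
                         Integer releaseYear, Pageable pageable) {
        BooleanBuilder predicate = new BooleanBuilder();

        // Only the request parameters that were sent contribute WHERE conditions
        if (minRentalRate != null) {
            predicate.and(QFilm.film.rentalRate.goe(minRentalRate));
        }
        if (maxRentalRate != null) {
            predicate.and(QFilm.film.rentalRate.loe(maxRentalRate));
        }
        if (releaseYear != null) {
            predicate.and(QFilm.film.releaseYear.eq(releaseYear));
        }
        return filmRepository.findAll(predicate, pageable);
    }
}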
But I didn’t find support to generate a dynamic number of JOIN clauses. Adding unneeded JOIN clauses to your SQL queries will impact the performance of your Spring Boot application or database.
This blog post covers how to extend Spring Data JPA for your repositories to access Querydsl objects so that you can write dynamic SQL queries.
Parsing CSV responses with a custom RestTemplate HttpMessageConverter
1. OVERVIEW
Even though RestTemplate has been deprecated in favor of WebClient, it’s still a very popular choice to integrate Java applications with in-house or third-party services.
If you find yourself working on application modernization, you will most likely need to integrate with legacy systems. Don’t be surprised if you get HTML, plain text, or CSV responses when integrating with them.
Of course you could use RestTemplate to get the response as a String and convert it to a Java object. But that’s not how you do it when retrieving JSON or XML responses.
You would only need:
ResponseEntity<Film> result = this.restTemplate.getForEntity(uri, Film.class);
and RestTemplate’s default HttpMessageConverters take care of the conversion.
This blog post helps you to write a custom RestTemplate HttpMessageConverter to convert CSV responses to Java objects.
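As a flavor of what such a converter looks like, here is a minimal sketch; the Film record, the single-line id,title CSV layout, and the lack of real CSV escaping are simplifying assumptions:

import org.springframework.http.HttpInputMessage;
import org.springframework.http.HttpOutputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.AbstractHttpMessageConverter;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

record Film(long id, String title) {}

public class FilmCsvHttpMessageConverter extends AbstractHttpMessageConverter<Film> {

    public FilmCsvHttpMessageConverter() {
        // Only handle text/csv responses
        super(new MediaType("text", "csv"));
    }

    @Override
    protected boolean supports(Class<?> clazz) {
        return Film.class.isAssignableFrom(clazz);
    }

    @Override
    protected Film readInternal(Class<? extends Film> clazz, HttpInputMessage inputMessage) throws IOException {
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(inputMessage.getBody(), StandardCharsets.UTF_8))) {
            // Assumes a single data line in the form: id,title
            String[] fields = reader.readLine().split(",");
            return new Film(Long.parseLong(fields[0]), fields[1]);
        }
    }

    @Override
    protected void writeInternal(Film film, HttpOutputMessage outputMessage) throws IOException {
        outputMessage.getBody().write(
                (film.id() + "," + film.title()).getBytes(StandardCharsets.UTF_8));
    }
}

You would register it with restTemplate.getMessageConverters().add(new FilmCsvHttpMessageConverter()) so that a getForEntity call like the one above also works for text/csv responses.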
Writing dynamic SQL queries using Spring Data JPA repositories and EntityManager
1. OVERVIEW
You would need to write dynamic SQL queries if, for instance, you need to implement a RESTful endpoint like:
/api/films?category=Action&category=Comedy&category=Horror&minRentalRate=0.5&maxRentalRate=4.99&releaseYear=2006
where the request parameters category, minRentalRate, maxRentalRate, and releaseYear might be optional.
The resulting SQL query’s WHERE clause, or even the number of table joins, changes based on the user input.
One option to write dynamic SQL queries in your Spring Data JPA repositories is to use Spring Data JPA Specification and Criteria API.
But Criteria queries are hard to read and write, especially complex ones. You might have tried to come up with the SQL query first and then reverse-engineer it to implement it using the Criteria API.
There are other options to write dynamic SQL or JPQL queries using Spring Data JPA.
This tutorial teaches you how to extend Spring Data JPA for your repositories to access the EntityManager so that you can write dynamic native SQL or JPQL queries.
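Before diving into the schema, here is a minimal sketch of the repository-fragment approach, assuming a Spring Boot 2 / javax.persistence setup; the Film entity, the FilmRepositoryCustom fragment, and the filter parameters are hypothetical stand-ins for what the tutorial builds:

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.TypedQuery;

import java.math.BigDecimal;
import java.util.List;

interface FilmRepositoryCustom {
    List<Film> findByOptionalFilters(BigDecimal minRentalRate, BigDecimal maxRentalRate, Integer releaseYear);
}

class FilmRepositoryCustomImpl implements FilmRepositoryCustom {

    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public List<Film> findByOptionalFilters(BigDecimal minRentalRate, BigDecimal maxRentalRate, Integer releaseYear) {
        // Build the JPQL string from the optional request parameters
        StringBuilder jpql = new StringBuilder("select f from Film f where 1 = 1");
        if (minRentalRate != null) {
            jpql.append(" and f.rentalRate >= :minRentalRate");
        }
        if (maxRentalRate != null) {
            jpql.append(" and f.rentalRate <= :maxRentalRate");
        }
        if (releaseYear != null) {
            jpql.append(" and f.releaseYear = :releaseYear");
        }

        TypedQuery<Film> query = entityManager.createQuery(jpql.toString(), Film.class);
        // Only bind the parameters that made it into the JPQL string
        if (minRentalRate != null) {
            query.setParameter("minRentalRate", minRentalRate);
        }
        if (maxRentalRate != null) {
            query.setParameter("maxRentalRate", maxRentalRate);
        }
        if (releaseYear != null) {
            query.setParameter("releaseYear", releaseYear);
        }
        return query.getResultList();
    }
}

The Spring Data repository can then extend both JpaRepository<Film, Long> and FilmRepositoryCustom, so callers get the derived query methods and the dynamic ones from a single interface.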
Let’s start with a partial ER diagram for the db_dvdrental relational database:
Fixing Hibernate HHH000104 firstResult maxResults warning using Spring Data JPA Specification and Criteria API
1. OVERVIEW
Whenever you use pagination and SQL joins to retrieve entities and their associations to prevent the N+1 select queries problem, you’ll most likely run into Hibernate’s HHH000104 warning message.
HHH000104: firstResult/maxResults specified with collection fetch; applying in memory!
This warning is bad and will affect your application’s performance once your dataset grows. Let’s see why.
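To make the warning concrete, here is a minimal sketch of the kind of repository method that triggers it; the Film entity and its actors collection are hypothetical, based on the pagination-plus-fetch-join scenario just described:

import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;

public interface FilmRepository extends JpaRepository<Film, Long> {

    // Pagination combined with a collection fetch join: Hibernate cannot apply
    // firstResult/maxResults in SQL, so it loads the whole result set and paginates
    // in memory, logging HHH000104.
    @Query(value = "select distinct f from Film f join fetch f.actors",
           countQuery = "select count(f) from Film f")
    Page<Film> findAllWithActors(Pageable pageable);
}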
Let’s start with the relationship between these tables:
It helps us write or generate our domain model, and we would end up with these relevant associated JPA entities: