This article is translated from: Building Reusable Mock Modules with Spring Boot - Reflectoring
Isn't it nice to split up a codebase into loosely coupled modules, each with a dedicated set of responsibilities?
This means we can easily find each responsibility in the codebase when we want to add or modify code. It also means that the codebase is easier to grasp, because we only need to load one module into our brain's working memory at a time.
And, since each module has its own API, it means we can create a reusable mock for each module. When writing an integration test, we just import a mock module and call its API to start mocking. We no longer need to know every detail of the classes we're mocking.
In this article, we'll look at creating such modules, discuss why mocking whole modules is better than mocking single beans, and then introduce a simple but effective way of mocking complete modules for easy test setups with Spring Boot.
Code example
This article is accompanied by a working code example on GitHub.
What is a module?
When I talk about "modules" in this article, I mean:
A module is a set of highly cohesive classes with a dedicated API and a set of associated responsibilities.
We can combine multiple modules into larger modules and finally form a complete application.
A module can use another module by calling its API.
You can also call them "components", but in this article, I will stick to "modules".
How to build modules?
When building applications, I recommend thinking ahead about how to modularize the code base. What are the natural boundaries in our code base?
Does our application need to communicate with an external system? That's a natural module boundary. We can build a module whose responsibility it is to talk to that external system!
Have we identified a functional "bounded context" of use cases that belong together? That's another good module boundary. We'll build a module that implements the use cases in this functional slice of the application!
There are more ways to split an application into modules, of course, and it's often hard to find the boundaries between them. They might even change over time! All the more important that we have a clear structure within our codebase so that we can easily move concepts between modules!
To make the module visible in our code base, I recommend the following package structure:
- Each module has its own package
- Each module package has an api sub-package that contains all the classes that are exposed to other modules
- Each module package has an internal sub-package that contains:
  - all the classes that implement the functionality exposed by the API
  - a Spring configuration class that contributes the beans to the Spring application context that are needed to implement the API
- Like Russian dolls, each module's internal sub-package may contain packages with sub-modules, each again with their own api and internal packages
- Classes in a given internal package may only be accessed by classes within that package.
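The last rule above can be enforced with Java's default (package-private) visibility. A minimal sketch, with hypothetical package and class names:

```java
// Hypothetical file: io/reflectoring/github/internal/GitHubService.java
package io.reflectoring.github.internal;

// No "public" modifier: this class is package-private, so it is
// invisible outside the internal package. Other modules can only
// depend on the interfaces exposed in the module's api package.
class GitHubService {
  // implementation details hidden from other modules
}
```

A `public` class in the api package, by contrast, is visible to everyone, which is exactly the distinction we want.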
This makes for a very clear codebase that is easy to navigate. Read more about this code structure in my article on clear architecture boundaries, or have a look at some of the code in the example project.
Now, that's a nice package structure, but what does that have to do with testing and mocking?
What's the problem with mocking single beans?
As I said in the beginning, we want to focus on mocking whole modules instead of single beans. But what's the problem with mocking single beans in the first place?
Let's take a look at a very common way of creating integration tests with Spring Boot.
Let's say we want to write an integration test for a REST controller that is supposed to create a repository on GitHub and then send an email to the user.
The integration test might look like this:
```java
@WebMvcTest
class RepositoryControllerTestWithoutModuleMocks {

  @Autowired
  private MockMvc mockMvc;

  @MockBean
  private GitHubMutations gitHubMutations;

  @MockBean
  private GitHubQueries gitHubQueries;

  @MockBean
  private EmailNotificationService emailNotificationService;

  @Test
  void givenRepositoryDoesNotExist_thenRepositoryIsCreatedSuccessfully() throws Exception {
    String repositoryUrl = "https://github.com/reflectoring/reflectoring";
    given(gitHubQueries.repositoryExists(...)).willReturn(false);
    given(gitHubMutations.createRepository(...)).willReturn(repositoryUrl);

    mockMvc.perform(post("/github/repository")
        .param("token", "123")
        .param("repositoryName", "foo")
        .param("organizationName", "bar"))
        .andExpect(status().is(200));

    verify(emailNotificationService).sendEmail(...);
    verify(gitHubMutations).createRepository(...);
  }
}
```
This test actually looks quite neat, and I've seen (and written) many tests like it. But as they say, the devil is in the details.
We're using the @WebMvcTest annotation to set up a Spring Boot application context for testing a Spring MVC controller. The application context will contain all the beans that are needed for the controller to work, and nothing else.
But our controller needs some additional beans in the application context to work, namely GitHubMutations, GitHubQueries, and EmailNotificationService. So we add mocks of those beans to the application context via the @MockBean annotation.
In the test method, we define the state of these mocks in a couple of given() statements, then call the controller endpoint we want to test, and then verify() that certain methods have been called on the mocks.
So, what's wrong with this test? Two main things come to mind:
First, to set up the given() and verify() sections, the test needs to know which methods on the mock beans the controller is calling. This low-level knowledge of implementation details makes the test vulnerable to change. Every time an implementation detail changes, we have to update the test as well. This dilutes the value of the test and makes maintaining tests a chore rather than a routine.
Second, the @MockBean annotation will cause Spring to create a new application context for each test class (unless the classes have exactly the same fields). In a codebase with more than a couple of controllers, this will significantly increase the test runtime.
If we invest a bit of effort into building the modular codebase outlined in the previous section, we can get around both of these disadvantages by building reusable mock modules.
Let's find out how by looking at a concrete example.
Modular Spring Boot application
OK, let's look at how we can implement reusable mock modules with Spring Boot.
Here's the folder structure of the example application. If you want to follow along, you can find the code on GitHub:
```
├── github
|   ├── api
|   |  ├── <I> GitHubMutations
|   |  ├── <I> GitHubQueries
|   |  └── <C> GitHubRepository
|   └── internal
|       ├── <C> GitHubModuleConfiguration
|       └── <C> GitHubService
├── mail
|   ├── api
|   |  └── <I> EmailNotificationService
|   └── internal
|       ├── <C> EmailModuleConfiguration
|       ├── <C> EmailNotificationServiceImpl
|       └── <C> MailServer
├── rest
|   └── internal
|       └── <C> RepositoryController
└── <C> DemoApplication
```
The application has three modules:
- The github module provides an interface to interact with the GitHub API,
- The mail module provides email functionality,
- The rest module provides a REST API to interact with applications.
Let's look at each module in more detail.
GitHub module
The github module provides two interfaces (marked with <I>) as part of its API:

- GitHubMutations, which provides some write operations on the GitHub API,
- GitHubQueries, which provides some read operations on the GitHub API.
This is what the interface looks like:
```java
public interface GitHubMutations {

  String createRepository(String token, GitHubRepository repository);

}

public interface GitHubQueries {

  List<String> getOrganisations(String token);

  List<String> getRepositories(String token, String organisation);

  boolean repositoryExists(String token, String repositoryName, String organisation);

}
```
It also provides the class GitHubRepository, which is used in the signatures of those interfaces.
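The article doesn't show GitHubRepository itself. Judging from how the controller constructs it, with new GitHubRepository(repoName, orgName), it is presumably a simple value class along these lines (a sketch; the field and getter names are assumptions):

```java
// A sketch of the GitHubRepository value class used in the API
// signatures. Field and accessor names are assumptions based on
// the constructor call new GitHubRepository(repoName, orgName).
// In the real api package this class would be public; the modifier
// is omitted here only so the sketch compiles in a single file.
class GitHubRepository {

  private final String name;
  private final String organisation;

  GitHubRepository(String name, String organisation) {
    this.name = name;
    this.organisation = organisation;
  }

  String getName() {
    return name;
  }

  String getOrganisation() {
    return organisation;
  }
}
```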
Internally, the github module has the class GitHubService, which implements both interfaces, and the class GitHubModuleConfiguration, a Spring configuration that contributes a GitHubService instance to the application context:
```java
@Configuration
class GitHubModuleConfiguration {

  @Bean
  GitHubService gitHubService() {
    return new GitHubService();
  }

}
```
Since GitHubService implements the whole API of the github module, this one bean is enough to make the module's API available to other modules in the same Spring Boot application.
Mail module
The mail module is built in a similar way. Its API consists of a single interface EmailNotificationService:
```java
public interface EmailNotificationService {

  void sendEmail(String to, String subject, String text);

}
```
This interface is implemented by the internal bean EmailNotificationServiceImpl.
Note that I'm using a different naming convention in the mail module than in the github module. While the github module has an internal class ending with *Service, the mail module has a *Service class as part of its API. And while the github module doesn't use the ugly *Impl suffix, the mail module does.
I did this on purpose to make the code more lifelike. Have you ever seen a codebase (that you didn't write yourself) that uses the same naming conventions everywhere? I haven't.
But if you build modules like we do in this article, it doesn't really matter much, because the ugly *Impl classes are hidden behind the module's API anyway.
Internally, the mail module has the class EmailModuleConfiguration, which contributes the implementation of the API to the Spring application context:
```java
@Configuration
class EmailModuleConfiguration {

  @Bean
  EmailNotificationService emailNotificationService() {
    return new EmailNotificationServiceImpl();
  }

}
```
REST module
The REST module consists of a single REST controller:
```java
@RestController
class RepositoryController {

  private final GitHubMutations gitHubMutations;
  private final GitHubQueries gitHubQueries;
  private final EmailNotificationService emailNotificationService;

  // constructor omitted

  @PostMapping("/github/repository")
  ResponseEntity<Void> createGitHubRepository(
      @RequestParam("token") String token,
      @RequestParam("repositoryName") String repoName,
      @RequestParam("organizationName") String orgName) {

    if (gitHubQueries.repositoryExists(token, repoName, orgName)) {
      return ResponseEntity.status(HttpStatus.BAD_REQUEST).build();
    }

    String repoUrl = gitHubMutations.createRepository(token, new GitHubRepository(repoName, orgName));
    emailNotificationService.sendEmail(
        "user@mail.com",
        "Your new repository",
        "Here's your new repository: " + repoUrl);

    return ResponseEntity.ok().build();
  }

}
```
The controller calls the github module's API to create a GitHub repository and then sends an email via the mail module's API to let the user know about the new repository.
Mocking the GitHub module
Now, let's look at how we can build a reusable mock for the github module. We create a @TestConfiguration class that provides all the beans of the module's API:
```java
@TestConfiguration
public class GitHubModuleMock {

  private final GitHubService gitHubServiceMock = Mockito.mock(GitHubService.class);

  @Bean
  @Primary
  GitHubService gitHubServiceMock() {
    return gitHubServiceMock;
  }

  public void givenCreateRepositoryReturnsUrl(String url) {
    given(gitHubServiceMock.createRepository(any(), any())).willReturn(url);
  }

  public void givenRepositoryExists() {
    given(gitHubServiceMock.repositoryExists(anyString(), anyString(), anyString())).willReturn(true);
  }

  public void givenRepositoryDoesNotExist() {
    given(gitHubServiceMock.repositoryExists(anyString(), anyString(), anyString())).willReturn(false);
  }

  public void assertRepositoryCreated() {
    verify(gitHubServiceMock).createRepository(any(), any());
  }

  public void givenDefaultState(String defaultRepositoryUrl) {
    givenRepositoryDoesNotExist();
    givenCreateRepositoryReturnsUrl(defaultRepositoryUrl);
  }

  public void assertRepositoryNotCreated() {
    verify(gitHubServiceMock, never()).createRepository(any(), any());
  }

}
```
Aside from providing the mocked GitHubService bean, we've added a bunch of given*() and assert*() methods to this class.
The given*() methods allow us to put the mock into a desired state, and the assert*() methods allow us to check whether some interaction with the mock has happened after running the test.
The @Primary annotation makes sure that, if both the mocked and the real bean are loaded into the application context, the mock takes precedence.
Mocking the email module
We build a very similar mock configuration for the mail module:
```java
@TestConfiguration
public class EmailModuleMock {

  private final EmailNotificationService emailNotificationServiceMock =
      Mockito.mock(EmailNotificationService.class);

  @Bean
  @Primary
  EmailNotificationService emailNotificationServiceMock() {
    return emailNotificationServiceMock;
  }

  public void givenSendMailSucceeds() {
    // nothing to do, the mock will simply return
  }

  public void givenSendMailThrowsError() {
    doThrow(new RuntimeException("error when sending mail"))
        .when(emailNotificationServiceMock)
        .sendEmail(anyString(), anyString(), anyString());
  }

  public void assertSentMailContains(String repositoryUrl) {
    verify(emailNotificationServiceMock).sendEmail(anyString(), anyString(), contains(repositoryUrl));
  }

  public void assertNoMailSent() {
    verify(emailNotificationServiceMock, never()).sendEmail(anyString(), anyString(), anyString());
  }

}
```
Using the mock modules in a test
Now, with the mock modules in place, we can use them in the integration test of our controller:
```java
@WebMvcTest
@Import({ GitHubModuleMock.class, EmailModuleMock.class })
class RepositoryControllerTest {

  @Autowired
  private MockMvc mockMvc;

  @Autowired
  private EmailModuleMock emailModuleMock;

  @Autowired
  private GitHubModuleMock gitHubModuleMock;

  @Test
  void givenRepositoryDoesNotExist_thenRepositoryIsCreatedSuccessfully() throws Exception {
    String repositoryUrl = "https://github.com/reflectoring/reflectoring.github.io";
    gitHubModuleMock.givenDefaultState(repositoryUrl);
    emailModuleMock.givenSendMailSucceeds();

    mockMvc.perform(post("/github/repository")
        .param("token", "123")
        .param("repositoryName", "foo")
        .param("organizationName", "bar"))
        .andExpect(status().is(200));

    emailModuleMock.assertSentMailContains(repositoryUrl);
    gitHubModuleMock.assertRepositoryCreated();
  }

  @Test
  void givenRepositoryExists_thenReturnsBadRequest() throws Exception {
    String repositoryUrl = "https://github.com/reflectoring/reflectoring.github.io";
    gitHubModuleMock.givenDefaultState(repositoryUrl);
    gitHubModuleMock.givenRepositoryExists();
    emailModuleMock.givenSendMailSucceeds();

    mockMvc.perform(post("/github/repository")
        .param("token", "123")
        .param("repositoryName", "foo")
        .param("organizationName", "bar"))
        .andExpect(status().is(400));

    emailModuleMock.assertNoMailSent();
    gitHubModuleMock.assertRepositoryNotCreated();
  }

}
```
We're using the @Import annotation to import the mocks into the application context.
Note that the @WebMvcTest annotation will cause the real modules to be loaded into the application context as well. That's why we used the @Primary annotation on the mocks, so that the mocks take precedence.
How to handle misbehaving modules?
A module may try to connect to some external service during startup and misbehave when that service isn't available. The mail module, for example, might create a pool of SMTP connections at startup. This naturally fails when there's no SMTP server available, which means that the Spring context will fail to start when we load the module in an integration test.
To make the module behave better during tests, we can introduce a configuration property mail.enabled. Then, we annotate the module's configuration class with @ConditionalOnProperty to tell Spring not to load this configuration if the property is set to false.
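Applied to the mail module, the guarded configuration might look like this sketch. The property name mail.enabled comes from the text; matchIfMissing = true is an assumption, so that the module still loads when the property isn't set at all:

```java
// Sketch: the configuration is only loaded when mail.enabled is
// "true" (or missing, because of matchIfMissing = true). Setting
// mail.enabled=false in a test keeps the real module out of the
// application context.
@Configuration
@ConditionalOnProperty(name = "mail.enabled", havingValue = "true", matchIfMissing = true)
class EmailModuleConfiguration {

  @Bean
  EmailNotificationService emailNotificationService() {
    return new EmailNotificationServiceImpl();
  }

}
```

A test can then switch the real module off via the property, for example with @WebMvcTest(properties = "mail.enabled=false").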
Now, during a test, only the mock module is loaded.
Instead of mocking specific method calls in the test, we now call the prepared given*() methods on the mock modules. This means the test no longer requires internal knowledge of the classes that the test subject calls.
After executing the code, we can use the prepared assert*() methods to verify whether a repository has been created or an email has been sent. Again, without knowing about the specific underlying method calls.
If we need the github or mail modules in another controller, we can use the same mock modules in the tests for that controller.
If we later decide to build another integration test that uses the real version of some modules but the mocked version of others, it's a matter of a couple of @Import annotations to build the application context we need.
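For example, a test that exercises the real github module but mocks the mail module could be set up like this sketch (the test class name is hypothetical; the mail.enabled property keeps the real mail module out of the context):

```java
// Sketch: an integration test mixing a real module with a mocked one.
// The real mail module is switched off via the mail.enabled property,
// and EmailModuleMock takes its place; the github module stays real.
@SpringBootTest(properties = "mail.enabled=false")
@AutoConfigureMockMvc
@Import(EmailModuleMock.class)
class RepositoryControllerWithRealGitHubTest {

  @Autowired
  private MockMvc mockMvc;

  @Autowired
  private EmailModuleMock emailModuleMock;

  // test methods would call the controller endpoint as before and
  // use emailModuleMock.assertSentMailContains(...) for verification,
  // while the real GitHubService handles the GitHub interaction

}
```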
This is the whole idea of modules: we can use the real module A together with a mock of module B, and we still have a working application that we can run tests against.
The mock modules are our central place for mocking behavior within a module. They can translate high-level mocking expectations like "make sure a repository can be created" into low-level calls on mocks of the API beans.
Conclusion
By being deliberate about what is part of a module's API and what is not, we can build a properly modular codebase with few unwanted dependencies.
Since we know what is part of the API and what isn't, we can build a dedicated mock for each module's API. We don't care about the internals, we're only mocking the API.
A mock module can provide an API to mock certain states and to verify certain interactions. By using the mock module's API instead of mocking each individual method call, our integration tests become more resilient to change.