Mendix Ignite: Unit testing, Simplified training

Mitchel Mol
February 20, 2024

After two blog posts (how to get started & improvements to the module) and an awesome meetup (on-demand video), there is more unit testing goodness to share. In this edition of Mendix Ignite, I will give you simplified training on implementing unit tests in your Mendix application. This training is based on the enhanced module I developed with the tech team at eXp Realty. If you do not have access to it yet, make sure to contact me.

Core components of a unit test

Before building unit tests, it is good to know some core unit testing components.

Test Fixture

The test fixture initialises the test environment. It is part of the Unit testing module, provides the environment to manage your tests and their results, and contains a runner to run your tests.

Test suites

Test suites are wrappers that organise your tests and their results. A test suite is automatically created for every module in your application.

Test Runner

The test runner executes the tests and ensures data variation is properly handled, results are stored, and any DB changes are rolled back. This is also part of the Unit testing module.


Assertion

An assertion is the validation that ensures the code behaves as expected. One unit test will have one or more assertions. If one fails, the other assertions will still be executed unless a fatal error occurs; in that case, the execution of the whole test is halted, and all changes are rolled back.

Fake objects/test data

These are objects with limited capabilities, not meant for anything but use in the unit test. For example, suppose a Person object holds personal data such as FirstName, LastName, DateOfBirth and Birthplace, but the code you are testing only validates whether the first name is filled. All you need to do is create a Person object and set the FirstName attribute (or leave it empty to test the failing validation).

Test coverage

Test coverage measures how much of the code is executed during testing, indicating how much of your code is covered by unit tests. Because the calculation only counts the microflows per module and the ones executed in that module as part of any unit test run, it is not an exact coverage report. For example, if a microflow has multiple parameters and multiple outcomes due to its decisions, you would ideally count all those paths and determine how many have been tested. But for most projects, this is a good start.
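To make the metric concrete, here is a minimal sketch of the module-level calculation described above (the numbers are hypothetical, and this is an illustration of the idea, not the module's actual implementation):

```java
public class CoverageExample {
    public static void main(String[] args) {
        // Hypothetical numbers for one module: 20 microflows in total,
        // of which 8 were executed during any unit test run.
        int totalMicroflows = 20;
        int executedDuringTests = 8;

        // Module-level coverage as counted by the module: executed / total.
        double coverage = 100.0 * executedDuringTests / totalMicroflows;
        System.out.println("Coverage: " + coverage + "%"); // prints "Coverage: 40.0%"
    }
}
```

Note that this counts whole microflows, not the decision paths inside them, which is exactly why the report is approximate.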

Common unit test types

To start with unit testing, it is good to know which types of logic are commonly covered by unit tests. Of course, unit testing is not limited to these types, but they give you a good idea of where to start.

Business logic

Generally speaking, this is the application's core functionality: for example, calculations with lots of variations, or decision trees leading to different outcomes.

Input validation

Handling invalid inputs gracefully, including edge cases and error handling, is important for a properly functioning application: it prevents garbage from finding its way into your application. Input validation is often already isolated into VAL_ microflows, making them easy targets for unit tests. With the help of the Mendix feedback module, you can capture the validation feedback without having to go through the trouble of readjusting your application to support both server-side and client-side execution of your validation logic.


Web services

APIs and web services should be tested for different response codes, mapping correctness and error conditions. You want to validate both your published and consumed web services. You should not call an actual web service (unless you also have a mock service as part of the CI/CD pipeline), because you want to isolate the execution of your unit tests to just your application. But you can still test import/export mappings, the handling of a consumed service response, or even the logic executed once a published service operation is called. This includes any custom authentication microflow you may have added.

Identify a microflow suitable for unit testing (or create one)

Microflows most suitable for unit testing are microflows with a single purpose. For example, a validation flow that validates the first name of a Person object: it has one input parameter of type Person, checks the validity of the FirstName attribute, and triggers validation feedback in case of an invalid or missing value. On top of that, it returns a Boolean value of true/false depending on the result.

The unit test for this validation microflow would have at least four different variations:

  1. A valid filled-in FirstName attribute;
  2. An empty value for the FirstName attribute;
  3. An empty string value for the FirstName attribute;
  4. And no Person object passed at all.

This requires either two unit tests or more complex data variation logic. Scenarios 1-3 could be covered in one unit test that always creates the Person object and then sets the FirstName attribute based on the data variation input parameter. The second unit test would cover scenario 4, the behaviour when no Person object is passed: essentially, it passes an empty value as the input parameter to the validation microflow.
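The validation rule behind these four scenarios is simple enough to sketch in ordinary code. The following Java snippet is an illustration of the logic the example VAL_ microflow would implement, not code from the module (the method name is made up for this example):

```java
public class FirstNameValidation {
    // Sketch of the validation rule: the first name must be
    // present and must not be an empty (or whitespace-only) string.
    public static boolean isFirstNameValid(String firstName) {
        return firstName != null && !firstName.trim().isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(isFirstNameValid("Ada"));  // scenario 1: prints "true"
        System.out.println(isFirstNameValid(null));   // scenario 2: prints "false"
        System.out.println(isFirstNameValid(""));     // scenario 3: prints "false"
        // Scenario 4 (no Person object at all) has no direct analogue here;
        // in the microflow it means passing empty as the input parameter.
    }
}
```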

So, to identify a microflow suitable for unit testing (or to create one), look for a microflow with a single purpose, a clear output to validate, and input you can control. Controlling the input can also be done by committing data to the database before executing the microflow, as long as you clean up the data afterwards.

It is possible to test microflows with multiple purposes, or even microflows that call sub-microflows. The problem is that the number of variations you should test increases exponentially, making your unit tests far more complex or more difficult to maintain. Such tests are component tests and should be implemented to augment the unit tests, not as a replacement.

Choosing the right prefix

The module supports three prefixes:

  • UT_ identifies unit tests that test actual single-purpose microflows, also known as units.
  • TEST_ identifies unit tests that test component microflows.
  • UTH_ identifies microflows used in support of unit tests. By giving them this prefix, they are excluded from the coverage calculation; the same goes for the other two prefixes.

AAA method

The implementation of unit tests always follows the AAA method: Arrange, Act and Assert.


Arrange

During the Arrange step, you set up the data needed to execute the microflow you are testing. Remember that you need to fully control the data you create, so do not rely on any data that should have been entered into the system outside of the execution of the unit tests. If you do, you cannot tell whether a wrong output from your tested microflow is caused by the microflow itself or by the input data.

Input data usually means one or more objects are created and enriched with data from the data variations. Sometimes the microflow under test will retrieve data from the database (for example, a GetCreate microflow). In that case, you should commit the test data to the database and remove it again after executing the microflow. Even though the unit test module will revert all changes, it only does so after fully executing the test suite, to allow for a setup microflow that performs a generic setup for the whole test suite. Because of this, any data you commit for a specific test might interfere with other tests and needs to be cleaned up after executing the microflow you are testing.

Arrange in a Unit test microflow


Act

The Act step is the simple part of the unit test: it just calls the microflow you are testing from your unit test microflow. Make sure you only call it once per unit test.

Act in a Unit test microflow


Assert

During the Assert step, you execute one or more assert microflow activities to validate that executing the microflow under test produced the expected output.

The assert microflow activity consists of four attributes:

  • AssertionResult: this is the helper object created at the start of your unit test to keep track of the assertion results.
  • ValueToAssert: this attribute contains the actual validation. As with any expression, it needs to return a Boolean, which is then used to determine whether the assertion is a success or a failure.
  • AssertionDescription: via this attribute, you may describe the assertion; you should also include the value you are validating and what you expected for easier analysis of the results.
  • TestDescription: this is almost the same as the assertion description, but written in plain human language, explaining the assertion you are making without any variables.
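As an illustration, a ValueToAssert expression for the first-name validation example might look like the following sketch in Mendix expression syntax. Both variable names are assumptions for this example: $ActualResult would hold the Boolean returned by the microflow under test, and $ExpectedResult the assert value retrieved from the data variation.

```
$ActualResult = $ExpectedResult
```

The expression returns true when the microflow's result matches the expectation, which is exactly the Boolean the assert activity needs.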
Assertion microflow activity
Assert in a Unit test microflow


Keep it simple

To keep your unit tests simple and easy to maintain, there is an easy strategy to follow:

  • Execute the microflow under test only once per unit test microflow.
  • Test only one microflow at a time per unit test microflow.
  • Separate success, error and invalid outcomes into different unit tests, unless you can easily cover them with data variation while keeping the rest of the unit test the same for each scenario.
  • Try to focus most of your attention (80%) on actual unit tests: tests of microflows with a single purpose.

Determine unit test scenarios

Most developers are wired to create new, exciting functionality. Most QA engineers are wired to break that new, exciting functionality before it breaks down in production. So, when determining your unit test scenarios, make it a team effort. A good starting point is during the refinement of the story: when the user scenarios are clear, sit down with your QA engineering colleague(s) and determine the unit test scenarios together. By doing this, you can create your unit tests early on, and you know what to test for when validating your new and exciting functionality. A good rule of thumb is to include scenarios for the happy path, the invalid data path and the full-on error path.

Data variation

To make your life easier, the enhanced unit testing module contains the option to use data variation. You may create one unit testing microflow using input parameters and assert value variation. The data variation is stored as JSON in your unit test microflow. When the unit testing module executes the unit test, it will retrieve this JSON string, parse it to DataVariation objects, and then execute the unit test microflow for each data variation.

The JSON string contains a description per variation to identify each data variation. Make sure to make this as descriptive as possible for easier understanding of the results.

JSON stored as part of the unit test
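The exact JSON schema used by the enhanced module is not reproduced in this post, so treat the following as an illustrative sketch of what the data variations for the FirstName example could look like; the property names are assumptions, not the module's actual schema:

```json
[
  {
    "Description": "Valid FirstName is accepted",
    "InputParameters": { "FirstName": "John" },
    "AssertValues": { "ExpectedResult": "true" }
  },
  {
    "Description": "Empty string FirstName is rejected",
    "InputParameters": { "FirstName": "" },
    "AssertValues": { "ExpectedResult": "false" }
  }
]
```

The key idea is one entry per variation, each with its own description, input values and expected assert values.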

The InputParameters and AssertValues are easily retrievable from your unit testing microflow by calling the Get input parameter value from data variation and Get input assert value from data variation microflow activities, which are bundled with the Unit testing module.

Just make sure you match the name with the names from the JSON.

Get the input parameter with the supplied microflow activity
Get the assert value with the supplied microflow activity

To keep things flexible and simple, only simple data types are supported: Boolean, date/time, decimal, integer, long and string. To create an object, you must build it in the microflow and fill its attributes with the input parameters or assert values. An enumeration value may be created either by parsing the string value into an actual enumeration or by creating a Java helper action that takes a string input parameter and directly returns it as a value of the right enumeration type. Of course, the string value used as an input parameter or assert value must match the enumeration value's key.
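Such a Java helper boils down to a single `Enum.valueOf` call. The sketch below uses a hypothetical `OrderStatus` enumeration; in a real Mendix Java action you would use the enumeration class generated in your module's proxies package instead:

```java
public class EnumHelper {
    // Hypothetical enumeration; its constant names play the role of
    // the enumeration keys referenced in the data variation JSON.
    public enum OrderStatus { Open, Shipped, Closed }

    public static OrderStatus toOrderStatus(String key) {
        // valueOf throws IllegalArgumentException for unknown keys,
        // so wrap it to fail fast with a clearer message for bad test data.
        try {
            return OrderStatus.valueOf(key);
        } catch (IllegalArgumentException e) {
            throw new IllegalArgumentException("Unknown OrderStatus key: " + key, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(toOrderStatus("Shipped")); // prints "Shipped"
    }
}
```

Because `valueOf` matches on the constant name, the string in your data variation must be the enumeration key, exactly as the paragraph above notes.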


If you have any questions, or just want to get started implementing unit tests in your Mendix application, feel free to reach out to me or any of my colleagues at Blue Green Solutions, and we will answer your questions or get you started on the road to higher-quality Mendix applications.

What is next?

Now that you know how to get started, that there is a better module out there, and how to implement unit tests with the enhanced module, there is still the culture shift that is often required to actually get your team on board.

Next up, Eternal debate!
