Unit and Integration Tests

Let’s refresh our memory about what we have developed so far in the Introduction to Vert.x series. In the first post, we developed a very simple Vert.x 3 application and saw how this application can be tested, packaged and executed. In the second post, we saw how this application became configurable, using a random port in tests and another configurable port in production. Finally, the previous post showed how to use vertx-web and how to implement a small REST API. However, we forgot an important task: we didn’t test the API. In this post we will increase the confidence we have in this application by implementing unit and integration tests.

The code of this post is available in the post-4 branch of the project. The starting point, however, is the code available in the post-3 branch.

Tests, Tests, Tests…

This post is mainly about tests. We distinguish two types of tests: unit tests and integration tests. Both are equally important, but have a different focus. Unit tests ensure that one component of your application, generally a class in the Java world, behaves as expected. The application is not tested as a whole, but piece by piece. Integration tests are more black-box: the application is started and generally tested externally.

In this post we are going to start with some more unit tests as a warm-up session and then focus on integration tests. If you have already implemented integration tests, you may be a bit scared, and that makes sense. But don’t worry, with Vert.x there are no hidden surprises.

Warmup: Some more unit tests

Let’s start slowly. Remember that in the first post we implemented a unit test with vertx-unit. That test was dead simple:

  1. we started the application before the test
  2. we checked that it replies “Hello”

Just to refresh your mind, let’s have a look at the code:

@Before
public void setUp(TestContext context) throws IOException {
  vertx = Vertx.vertx();
  ServerSocket socket = new ServerSocket(0);
  port = socket.getLocalPort();
  socket.close();
  DeploymentOptions options = new DeploymentOptions()
      .setConfig(new JsonObject().put("http.port", port)
      );
  vertx.deployVerticle(MyFirstVerticle.class.getName(), options, context.asyncAssertSuccess());
}

The setUp method is invoked before each test (as instructed by the @Before annotation). It first creates a new instance of Vert.x, then picks a free port, and deploys our verticle with the right configuration. Thanks to context.asyncAssertSuccess(), it waits until the verticle has been deployed successfully.

The tearDown method is straightforward: it just closes the Vert.x instance, which automatically un-deploys the verticles:

@After
public void tearDown(TestContext context) {
  vertx.close(context.asyncAssertSuccess());
}

Finally, our single test is:

@Test
public void testMyApplication(TestContext context) {
  final Async async = context.async();
  vertx.createHttpClient().getNow(port, "localhost", "/", response -> {
    response.handler(body -> {
      context.assertTrue(body.toString().contains("Hello"));
      async.complete();
    });
  });
}

It only checks that the application replies “Hello” when we emit an HTTP request on /.

Let’s now implement some unit tests checking that our web application and the REST API behave as expected. Let’s start by verifying that the index.html page is correctly served. This test is very similar to the previous one:

@Test
public void checkThatTheIndexPageIsServed(TestContext context) {
  Async async = context.async();
  vertx.createHttpClient().getNow(port, "localhost", "/assets/index.html", response -> {
    context.assertEquals(response.statusCode(), 200);
    context.assertEquals(response.headers().get("content-type"), "text/html");
    response.bodyHandler(body -> {
      context.assertTrue(body.toString().contains("<title>My Whisky Collection</title>"));
      async.complete();
    });
  });
}

We retrieve the index.html page and check:

  1. it’s there (status code 200)
  2. it’s an HTML page (content type set to “text/html”)
  3. it has the right title (“My Whisky Collection”)

As you can see, we can test the status code and the headers directly on the HTTP response, but to ensure that the body is right, we need to retrieve it. This is done with a body handler that receives the complete body as a parameter. Once the last check is made, we release the async by calling complete.

Ok, great, but this actually does not test our REST API. Let’s ensure that we can add a bottle to the collection. Unlike the previous tests, this one uses post to send data to the server:

@Test
public void checkThatWeCanAdd(TestContext context) {
  Async async = context.async();
  final String json = Json.encodePrettily(new Whisky("Jameson", "Ireland"));
  final String length = Integer.toString(json.length());
  vertx.createHttpClient().post(port, "localhost", "/api/whiskies")
      .putHeader("content-type", "application/json")
      .putHeader("content-length", length)
      .handler(response -> {
        context.assertEquals(response.statusCode(), 201);
        context.assertTrue(response.headers().get("content-type").contains("application/json"));
        response.bodyHandler(body -> {
          final Whisky whisky = Json.decodeValue(body.toString(), Whisky.class);
          context.assertEquals(whisky.getName(), "Jameson");
          context.assertEquals(whisky.getOrigin(), "Ireland");
          context.assertNotNull(whisky.getId());
          async.complete();
        });
      })
      .write(json)
      .end();
}

First, we create the content we want to add. The server consumes JSON data, so we need a JSON string. You can either write your JSON document manually or use the Vert.x method Json.encodePrettily as done here. Once we have the content, we create a post request. We need to configure some headers to be correctly read by the server: first, we say that we are sending JSON data, and we also set the content length. We also attach a response handler performing checks very close to those made in the previous test. Notice that we can rebuild our object from the JSON document sent by the server using the Json.decodeValue method. It’s very convenient as it avoids lots of boilerplate code. At this point the request has not been emitted; we need to write the data and call the end() method. This is done with .write(json).end();.

The order of the methods is important. You cannot write data if you don’t have a response handler configured. Finally, don’t forget to call end.

So, let’s try this. You can run the tests using:

mvn clean test

We could continue writing more unit tests like that, but it could become quite complex. Let’s see how we could continue our testing using integration tests.

IT hurts

Well, I think we need to make that clear: integration testing hurts. If you have experience in this area, can you remember how long it took to set everything up correctly? I get new white hairs just thinking about it. Why are integration tests more complicated? It’s basically because of the setup:

  1. We must start the application in a close-to-production way
  2. We must then run the tests (and configure them to hit the right application instance)
  3. We must stop the application

That does not sound unconquerable, but if you need Linux, Mac OS X and Windows support, it quickly gets messy. There are plenty of great frameworks easing this, such as Arquillian, but let’s do it without any framework to understand how it works.

We need a battle plan

Before rushing into the complex configuration, let’s think for a minute about the tasks:

Step 1 - Reserve a free port. We need to get a free port on which the application can listen, and we need to inject this port into our integration tests.

Step 2 - Generate the application configuration. Once we have the free port, we need to write a JSON file configuring the application’s HTTP port to this port.

Step 3 - Start the application. Sounds easy, right? Well, it’s not that simple, as we need to launch our application in a background process.

Step 4 - Execute the integration tests. Finally, the central part: run the tests. But before that we should implement some integration tests. Let’s come to that later.

Step 5 - Stop the application. Once the tests have been executed, regardless of failures or errors in the tests, we need to stop the application.

There are multiple ways to implement this plan. We are going to use a generic one. It’s not necessarily the best, but it can be applied almost everywhere. The approach is tied to Apache Maven. If you want to propose an alternative using Gradle or a different tool, I will be happy to add your way to the post.

Implement the plan

As said above, this section is Maven-centric, and most of the code goes in the pom.xml file. If you have never used the different Maven lifecycle phases, I recommend you have a look at the introduction to the Maven lifecycle.

We need to add and configure a couple of plugins. Open the pom.xml file and, in the <plugins> section, add:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.9.1</version>
  <executions>
    <execution>
      <id>reserve-network-port</id>
      <goals>
        <goal>reserve-network-port</goal>
      </goals>
      <phase>process-sources</phase>
      <configuration>
        <portNames>
          <portName>http.port</portName>
        </portNames>
      </configuration>
    </execution>
  </executions>
</plugin>

We use the build-helper-maven-plugin (a plugin worth knowing if you use Maven often) to pick a free port. Once found, the plugin assigns the http.port variable to the picked port. We execute this plugin early in the build (during the process-sources phase), so we can use the http.port variable in the other plugins. That was the first step.

Two actions are required for the second step. First, in the pom.xml file, just below the <build> opening tag, add:

<testResources>
  <testResource>
    <directory>src/test/resources</directory>
    <filtering>true</filtering>
  </testResource>
</testResources>

This instructs Maven to filter resources from the src/test/resources directory. Filtering means replacing placeholders with actual values. That’s exactly what we need, as we now have the http.port variable. So create the src/test/resources/my-it-config.json file with the following content:

{
  "http.port": ${http.port}
}

This configuration file is similar to the one we used in previous posts. The only difference is ${http.port}, which is the (default) Maven syntax for filtering. When Maven processes our file, it will replace ${http.port} with the selected port. That’s all for the second step.
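To make the filtering concrete, here is what the generated file could look like once Maven has copied it to target/test-classes/my-it-config.json. The port value below is just an arbitrary example of what build-helper might pick:

```json
{
  "http.port": 38572
}
```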

Steps 3 and 5 are a bit trickier: we have to start and stop the application. We are going to use the maven-antrun-plugin to achieve this. In the pom.xml file, below the build-helper-maven-plugin, add:

<!-- We use the maven-antrun-plugin to start the application before the integration tests
and stop them afterward -->
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>start-vertx-app</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!--
          Launch the application as in 'production' using the fatjar.
          We pass the generated configuration, configuring the http port to the picked one
          -->
          <exec executable="${java.home}/bin/java"
                dir="${project.build.directory}"
                spawn="true">
            <arg value="-jar"/>
            <arg value="${project.artifactId}-${project.version}-fat.jar"/>
            <arg value="-conf"/>
            <arg value="${project.build.directory}/test-classes/my-it-config.json"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>stop-vertx-app</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <!--
          Kill the started process.
          Finding the right process is a bit tricky. The Windows command is in the windows profile (below)
          -->
        <target>
          <exec executable="bash"
                dir="${project.build.directory}"
                spawn="false">
            <arg value="-c"/>
            <arg value="ps ax | grep -Ei '[\-]DtestPort=${http.port}\s+\-jar\s+${project.artifactId}' | awk 'NR==1{print $1}' | xargs kill -SIGTERM"/>
          </exec>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>

That’s a huge piece of XML, isn’t it? We configure two executions of the plugin. The first one, happening in the pre-integration-test phase, launches the application. It basically executes:

java -jar my-first-app-1.0-SNAPSHOT-fat.jar -conf .../my-it-config.json
Is the fat jar created?

The fat jar embedding our application is created in the package phase, which precedes pre-integration-test, so yes, the fat jar is created.

As mentioned above, we launch the application as we would in a production environment.

Once the integration tests have been executed (step 4, which we didn’t look at yet), we need to stop the application (in the post-integration-test phase). To stop the application, we invoke some shell magic to find our process with the ps command and send it the SIGTERM signal. It is equivalent to:

ps
.... -> find your process id
kill -SIGTERM your_process_id
And Windows?

I mentioned it above: we want Windows to be supported, and these commands are not going to work on Windows. Don’t worry, the Windows configuration is below…

We should now do the fourth step we (silently) skipped. To execute our integration tests, we use the maven-failsafe-plugin. Add the following plugin configuration to your pom.xml file:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.18.1</version>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
      <configuration>
        <systemProperties>
          <http.port>${http.port}</http.port>
        </systemProperties>
      </configuration>
    </execution>
  </executions>
</plugin>

As you can see, we pass the http.port property as a system variable, so our tests are able to connect to the right port.

That’s all! Wow… Let’s try this (Windows users will need to be patient, or jump to the last section):

mvn clean verify

We should not use mvn integration-test, because the application would still be running afterwards. The verify phase comes after post-integration-test and analyses the integration test results. Build failures caused by failed integration test assertions are reported in this phase.

Hey, we don’t have integration tests!

And that’s right. We have set up everything, but we don’t have a single integration test. To ease the implementation, let’s use two libraries: AssertJ and Rest-Assured.

AssertJ proposes a set of assertions that you can chain and use fluently. Rest Assured is a framework to test REST APIs.

In the pom.xml file, add the two following dependencies just before </dependencies>:

<dependency>
  <groupId>com.jayway.restassured</groupId>
  <artifactId>rest-assured</artifactId>
  <version>2.4.0</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.assertj</groupId>
  <artifactId>assertj-core</artifactId>
  <version>2.0.0</version>
  <scope>test</scope>
</dependency>

Then, create the src/test/java/io/vertx/blog/first/MyRestIT.java file. Unlike unit tests, integration test class names end with IT. It’s a convention of the Failsafe plugin to distinguish unit tests (starting or ending with Test) from integration tests (starting or ending with IT). In the created file, add:

package io.vertx.blog.first;

import com.jayway.restassured.RestAssured;
import org.junit.AfterClass;
import org.junit.BeforeClass;

public class MyRestIT {

  @BeforeClass
  public static void configureRestAssured() {
    RestAssured.baseURI = "http://localhost";
    RestAssured.port = Integer.getInteger("http.port", 8080);
  }

  @AfterClass
  public static void unconfigureRestAssured() {
    RestAssured.reset();
  }
}

The methods annotated with @BeforeClass and @AfterClass are invoked once before / after all the tests of the class. Here, we just retrieve the HTTP port (passed as a system property) and configure REST Assured.
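The bridge between the Failsafe <systemProperties> configuration and the test is just Java’s system-property API. As a small sketch (the class name is illustrative, not part of the project), Integer.getInteger reads the property and falls back to the given default when it is absent:

```java
// Illustrative helper showing how the http.port system property is resolved.
public class PortConfig {

    // Reads -Dhttp.port if set, otherwise falls back to 8080,
    // exactly like the call used in configureRestAssured().
    static int resolvePort() {
        return Integer.getInteger("http.port", 8080);
    }
}
```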

Am I ready to serve?

You may need to wait in the configureRestAssured method until the HTTP server has been started. We recommend the Awaitility test framework to check that requests can be served, so the test fails if the server does not start.
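Awaitility gives you a fluent API for this kind of check; if you prefer not to add a dependency, the same idea can be sketched with a plain-JDK polling loop (class and method names below are illustrative):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Dependency-free sketch of a "wait until the server accepts connections" check.
public class ServerReadiness {

    // Polls until a TCP connection to host:port succeeds or the timeout expires.
    // Returns true if the server became reachable in time.
    static boolean waitUntilReachable(String host, int port, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 250);
                return true; // the server is listening
            } catch (IOException serverNotYetUp) {
                try {
                    Thread.sleep(100); // short pause before retrying
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return false;
                }
            }
        }
        return false;
    }
}
```

Calling waitUntilReachable("localhost", port, 5000) at the end of configureRestAssured, and failing the test class if it returns false, would give the same guarantee.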

It’s now time to implement a real test. Let’s check that we can retrieve an individual product:

@Test
public void checkThatWeCanRetrieveIndividualProduct() {
  // Get the list of bottles, ensure it's a success and extract the first id.
  final int id = get("/api/whiskies").then()
      .assertThat()
      .statusCode(200)
      .extract()
      .jsonPath().getInt("find { it.name=='Bowmore 15 Years Laimrig' }.id");
  // Now get the individual resource and check the content
  get("/api/whiskies/" + id).then()
      .assertThat()
      .statusCode(200)
      .body("name", equalTo("Bowmore 15 Years Laimrig"))
      .body("origin", equalTo("Scotland, Islay"))
      .body("id", equalTo(id));
}

Here you can appreciate the power and expressiveness of Rest Assured. We retrieve the list of products, ensure the response is correct, and extract the id of a specific bottle using a JSON (Groovy) path expression.

Then, we retrieve the metadata of this individual product and check the result.

Let’s now implement a more sophisticated scenario: adding and deleting a product:

@Test
public void checkWeCanAddAndDeleteAProduct() {
  // Create a new bottle and retrieve the result (as a Whisky instance).
  Whisky whisky = given()
      .body("{\"name\":\"Jameson\", \"origin\":\"Ireland\"}").request().post("/api/whiskies").thenReturn().as(Whisky.class);
  assertThat(whisky.getName()).isEqualToIgnoringCase("Jameson");
  assertThat(whisky.getOrigin()).isEqualToIgnoringCase("Ireland");
  assertThat(whisky.getId()).isNotZero();
  // Check that it has created an individual resource, and check the content.
  get("/api/whiskies/" + whisky.getId()).then()
      .assertThat()
      .statusCode(200)
      .body("name", equalTo("Jameson"))
      .body("origin", equalTo("Ireland"))
      .body("id", equalTo(whisky.getId()));
  // Delete the bottle
  delete("/api/whiskies/" + whisky.getId()).then().assertThat().statusCode(204);
  // Check that the resource is not available anymore
  get("/api/whiskies/" + whisky.getId()).then()
      .assertThat()
      .statusCode(404);
}

So, now that we have integration tests, let’s try:

mvn clean verify

Simple, no? Well, simple once the setup is done right… You can continue implementing other integration tests to be sure that everything behaves as you expect.

Dear Windows users…

This section is the bonus part for Windows users, or people wanting to run their integration tests on Windows machines too. The command we execute to stop the application is not going to work on Windows. Luckily, it’s possible to extend the pom.xml with a profile executed on Windows.

In your pom.xml, just after </build>, add:

<profiles>
  <!-- A profile for windows as the stop command is different -->
  <profile>
    <id>windows</id>
    <activation>
      <os>
        <family>windows</family>
      </os>
    </activation>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-antrun-plugin</artifactId>
          <version>1.8</version>
          <executions>
            <execution>
              <id>stop-vertx-app</id>
              <phase>post-integration-test</phase>
              <goals>
                <goal>run</goal>
              </goals>
              <configuration>
                <target>
                  <exec executable="wmic"
                      dir="${project.build.directory}"
                      spawn="false">
                    <arg value="process"/>
                    <arg value="where"/>
                    <arg value="CommandLine like '%${project.artifactId}%' and not name='wmic.exe'"/>
                    <arg value="delete"/>
                  </exec>
                </target>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

This profile replaces the actions described above to stop the application with a version working on Windows. The profile is automatically enabled on Windows. As on other operating systems, execute with:

mvn clean verify

Conclusion

Wow, what a trip! We are done… In this post we have seen how we can gain confidence in Vert.x applications by implementing both unit and integration tests. Unit tests, thanks to vertx-unit, are able to check the asynchronous aspects of Vert.x applications, but can become complex for large scenarios. Thanks to Rest Assured and AssertJ, integration tests are dead simple to write… but the setup is not straightforward. This post has shown how it can be configured easily. Obviously, you can also use AssertJ and Rest Assured in your unit tests.

In the next post, we will replace the in-memory backend with a database, and use asynchronous integration with this database.

Stay tuned & happy coding!
